ShipTalk - SRE, DevOps, Platform Engineering, Software Delivery

DORA, DevEx, and the Role of AI with Nathen Harvey (Google Cloud)

By Harness · Season 4, Episode 1

In this kickoff episode of ShipTalk Season 4, Dewan sits down with Nathen Harvey, DORA Lead and Developer Advocate at Google Cloud, to explore how DevOps metrics and AI are transforming software delivery.

From his early days in DevOps to leading the DORA initiative, Nathen shares lessons from open-source communities, the role of culture, and how AI is reshaping documentation, incident response, and even coding itself. They also dig into the 2025 DORA Survey, discuss when organizations should start tracking metrics, and what the future holds for engineering leaders.

🎙️ Your voice matters. Take the 2025 DORA survey (closes on July 18, 2025).

🎟️ Visit the Harness booth at DevOpsDays Dallas 2025, and use the promo code SHIPTALK to take 20% off your ticket.

1
00:00:02.930 --> 00:00:22.580
Dewan Ahmed: Good morning, good afternoon, time-appropriate greetings. Yesterday was July 1st, Canada Day, and July 4th is coming, so best wishes to listeners wherever you're listening from. My name is Dewan Ahmed, and I'm your host for the ShipTalk podcast, where we talk about the ins and outs

2
00:00:22.680 --> 00:00:36.519
Dewan Ahmed: ups and downs of software delivery, and I'm super pumped to kick off Season 4 with Nathen Harvey, DORA lead and developer advocate at Google Cloud. Welcome, Nathen.

3
00:00:36.600 --> 00:00:41.730
Nathen Harvey (he/him): Dewan, thank you so much for having me, and what an honor to kick off Season 4 with you!

4
00:00:41.950 --> 00:00:57.129
Dewan Ahmed: Yeah, absolutely. First of all, I love the shirt. For listeners on the podcast, Nathen is wearing an avocado shirt, and if you're watching on YouTube, you can see it. So tell us about the shirt to kick off.

5
00:00:57.170 --> 00:01:24.139
Nathen Harvey (he/him): Yeah, this shirt is kind of a signature of mine. I tend to wear it at a lot of customer meetings, conference presentations, podcasts, and so forth. I've been wearing an avocado shirt for a long, long time. It's kind of a nod to our role as developer advocates, but this shirt in particular is very special, because the people that make this shirt have discontinued it, and I needed a new one.

6
00:01:24.740 --> 00:01:33.469
Nathen Harvey (he/him): Gleb at Harness sourced one on the aftermarket and shipped it to me, and for that I'm forever grateful.

7
00:01:33.980 --> 00:01:58.510
Dewan Ahmed: That is fantastic. I have to ping Gleb so that I can grab a copy of that shirt. For the listeners on the podcast: I have an avocado pen, again in reference to our role as developer advocates. So before we actually kick off our topics, why don't you tell us, Nathen Harvey, how did it all start? How did your journey start?

8
00:01:58.530 --> 00:02:04.370
Dewan Ahmed: And how did that lead to DevOps, SRE, and being a developer advocate at Google Cloud?

9
00:02:04.600 --> 00:02:28.819
Nathen Harvey (he/him): Yeah, for sure. Unfortunately, I'm very old, so it's a long journey; I'll try to keep it short. Essentially, I found myself in tech in the very early days of the first Internet boom. So before the year 2000, I was in tech doing a lot of different things, and I've been very lucky to have just a really great career in technology.

10
00:02:28.930 --> 00:02:53.730
Nathen Harvey (he/him): I would say that I've done everything from being a software engineer to an SRE or a system administrator. I've basically held every single role across the technology sphere, which has been really great. That includes things like customer support, sales engineer, and more. But most recently, maybe, along

11
00:02:53.730 --> 00:03:17.009
Nathen Harvey (he/him): the DevOps horizon: I was working in an organization when DevOps, the word, first took hold and really started to gain momentum at that particular organization. At the time we were running all of our infrastructure in a data center, and we were in the process of moving it all to the cloud and automating all of it. Through this automation is where I discovered Chef Software

12
00:03:17.010 --> 00:03:20.639
Nathen Harvey (he/him): and really a connection to open source.

13
00:03:20.690 --> 00:03:45.619
Nathen Harvey (he/him): I eventually left that company and went to Chef to help lead the community. There I was the Vice President of Community Engineering, which meant that I was responsible for our open-source offerings and how our engineers worked together with the open-source contributors. I was responsible for developer relations and developer advocacy, working together with customers and the industry to really help understand how we can best utilize infrastructure as

14
00:03:45.620 --> 00:03:58.100
Nathen Harvey (he/him): code to deliver what we want for our customers and for our businesses. While I was there I was very lucky to work with folks like Dr. Nicole Forsgren and Jez Humble, who later went on to found a company called DORA.

15
00:03:58.100 --> 00:04:13.799
Nathen Harvey (he/him): When I left Chef I went to Google Cloud to continue my journey as a developer advocate, helping teams and organizations get better at DevOps and SRE practices, and in general helping them leverage technology. Shortly after I joined Google Cloud,

16
00:04:13.800 --> 00:04:35.169
Nathen Harvey (he/him): the company that Jez and Nicole and Gene had founded, DORA, was acquired by Google Cloud, and so I was kind of reunited with both Nicole and Jez and got to work together with them. Nicole has since left, Jez is now working on other areas within Google Cloud, and I've taken over running DORA, which has been really great.

17
00:04:36.340 --> 00:04:49.240
Dewan Ahmed: That is fantastic. And it just goes to show how small the tech world is: you work with someone, you meet someone, and then down the road you find yourselves somehow communicating or working together again.

18
00:04:49.320 --> 00:05:09.810
Nathen Harvey (he/him): Absolutely. I think it's also a good reminder, not that we necessarily need it, but maybe sometimes it helps, that we should be really welcoming and inclusive and work well with the other humans that we come into contact with. Because, as you said, it is a small world; even if we aren't working with them tomorrow, we might be working with them again in the future.

19
00:05:10.070 --> 00:05:27.909
Dewan Ahmed: Absolutely, couldn't agree more. Now, I heard you mention open-source teams, open-source communities, leading developer advocates. You have worked with some of the best teams and open-source communities out there. How do some of the lessons you learned relate to modern software delivery?

20
00:05:28.290 --> 00:05:51.940
Nathen Harvey (he/him): Yeah, that's a really good question. I think it's a matter of experience in getting to work with all of these teams and seeing how they're doing things. I can share lessons learned and, importantly, pitfalls, pitfalls that I myself have fallen into. I'll never forget when I was first learning, or actually before I was learning, how to

21
00:05:51.940 --> 00:06:06.439
Nathen Harvey (he/him): automate infrastructure with infrastructure as code. I was working with a mentor, and I built out a bunch of things all by hand. I was very proud of the servers that I built by hand. I gave them all names; I cared for them like they were pets.

22
00:06:06.710 --> 00:06:16.340
Nathen Harvey (he/him): And my mentor came to me and said, you know, there are these tools that can help automate away all of that work that you've been doing. And I looked him dead in the eye, and I said, "Tom,

23
00:06:16.810 --> 00:06:36.959
Nathen Harvey (he/him): I'm too busy to automate." Which, of course, is a ridiculous statement: too busy to automate? Well, if I automate, I'll have more time, I'll be less busy, so then I can make additional improvements. But that phrase right there, number one, it haunts me. But number two, it also reminds me that

24
00:06:37.210 --> 00:07:00.499
Nathen Harvey (he/him): every one of us, and certainly every team that we're working on, gets so caught up in the things that we're doing that it can sometimes be very difficult to pause, take stock of where we are, and decide: what do we want to change? How do we want to push against the status quo and make some sort of improvement, even if it's a small one, in the way that we work?

25
00:07:00.580 --> 00:07:25.549
Nathen Harvey (he/him): And so I think it's stories like that that really keep me inspired and keep me going: how do we just make a small change? I want to change everything; I know I can't do that. But can I make one small change today? And then with DORA, we have this community of practitioners and leaders that are taking insights from DORA and putting them into practice. We come together regularly as a community to learn from and share

26
00:07:25.550 --> 00:07:36.539
Nathen Harvey (he/him): with one another. And I think that's the real thing: it's not listening to a single voice, it's getting the collective wisdom from the community that really helps drive these sorts of improvements.

27
00:07:36.870 --> 00:07:55.920
Dewan Ahmed: That is awesome to hear, and that's a nice segue to DORA. For those listeners: when I search for DORA on the Internet, Dora the Explorer comes up. So why don't you tell us what DORA is, and why has it been such a cornerstone of the DevOps world?

28
00:07:56.130 --> 00:08:22.489
Nathen Harvey (he/him): Yeah, for sure. So DORA is a research program that's been running for well over a decade now. The research actually originated out of Puppet Labs, which was in the same space as Chef, infrastructure as code, and it started in the early days of the DevOps movement. The folks at Puppet Labs really wanted to understand what's happening with DevOps. Is this a real thing? Where is it going? And so they started this research program.

29
00:08:22.490 --> 00:08:28.860
Nathen Harvey (he/him): And this research is really focused, as I said earlier, on: how do technology teams get better?

30
00:08:28.920 --> 00:08:47.270
Nathen Harvey (he/him): And its center of gravity is around software delivery performance, because certainly in the early days of DevOps, and unfortunately still in many organizations today, the act of taking software that was written and getting it into the hands of users

31
00:08:47.490 --> 00:09:14.299
Nathen Harvey (he/him): is full of friction and full of challenges. And the reality is that software that is written doesn't actually provide any value until it's in the hands of the users. So this is a really critical moment of the software delivery lifecycle, or the software development lifecycle: how do we get this code, this technology, into the hands of users? This is where we can first get feedback on it and decide what we want to do next.

32
00:09:14.420 --> 00:09:37.920
Nathen Harvey (he/him): So the research really has this center of gravity around software delivery performance. And one of the big questions at the onset of the research was: how do you measure software delivery performance? This is where something many listeners may have heard of comes in: the DORA four keys, or the DORA metrics. These are the software delivery performance indicators that really tell us how we're doing with software delivery performance.

33
00:09:37.920 --> 00:10:02.850
Nathen Harvey (he/him): Those metrics focus on two high-level concepts: throughput and stability. How much change are you able to ship through the system? How much software are you able to deliver, how fast, how safely, and so forth? And then the stability of those changes: when we ship things, we want to make sure that they work when they land in the production environment, that they don't cause incidents or outages or,

34
00:10:02.850 --> 00:10:20.609
Nathen Harvey (he/him): you know, require immediate human intervention. So DORA starts with a center of gravity around software delivery performance. Then the research looks both at the outcomes of better software delivery performance, things like better organizational performance and better well-being for the people on your team.

35
00:10:20.610 --> 00:10:49.320
Nathen Harvey (he/him): But we also look at what drives software delivery performance. We call those the capabilities and conditions, and those are things that we investigate very deeply every single year. Capabilities and conditions might include things like: are you using version control? Do you practice continuous integration? What does your change approval process look like? Do you have good documentation? What's the culture like on your team? All of these things. We try to take a really complete view of technology-driven teams and organizations.

36
00:10:49.320 --> 00:10:51.440
Nathen Harvey (he/him): I should also mention that the research

37
00:10:51.620 --> 00:11:00.090
Nathen Harvey (he/him): is fully program and platform agnostic. It's not about specific tools, but rather those capabilities and conditions that you need as a team.

38
00:11:01.000 --> 00:11:22.159
Dewan Ahmed: That is great to hear. So you talked about high-performing teams, you talked about the four KPIs. I want to touch on the often-forgotten part, the culture part of DevOps. Tech has been changing massively with AI and different ways of teams working: in person, hybrid, remote.

39
00:11:22.290 --> 00:11:33.300
Dewan Ahmed: Are we seeing teams now being more aligned to business needs? How has culture, or cultural misalignment, changed over the last decade or so?

40
00:11:33.570 --> 00:11:57.250
Nathen Harvey (he/him): Yeah, throughout the research program we find that culture is one of the biggest predictors of those performance metrics and the outcomes that we really care about, and we have used ways to assess culture from a number of different sources. Primarily, though, we use a cultural assessment that was developed by Dr. Ron Westrum, who's a sociologist. He created this

41
00:11:57.250 --> 00:12:03.839
Nathen Harvey (he/him): typology of organizational cultures that looks at the three different types of organizational cultures that you might encounter.

42
00:12:03.840 --> 00:12:27.259
Nathen Harvey (he/him): He calls those cultures the pathological, the bureaucratic, and the generative cultures. As you might imagine, we see the best performance happening in generative cultures. And what are the characteristics of a generative culture? Well, there are high levels of collaboration and information sharing across teams and across the organization. When something goes wrong,

43
00:12:27.480 --> 00:12:49.589
Nathen Harvey (he/him): the organization uses it as an opportunity to learn. What can we learn from this thing that went wrong, whether that's an incident or an outage in your production systems, or maybe a sales deal you were trying to close and didn't? Let's use that as an opportunity to learn, and then take those lessons into the future work that we're doing.

44
00:12:49.750 --> 00:12:55.090
Nathen Harvey (he/him): And so culture plays a huge, huge role in everything that we research.

45
00:12:57.070 --> 00:13:23.020
Dewan Ahmed: So we're kicking off Season 4 with the theme of AI in software delivery, and I was reading on the DORA website that the research also focuses on the intersection with AI. Could you please tell our listeners how AI is intersecting with the research that's happening at DORA, and how this will impact engineering teams?

46
00:13:23.210 --> 00:13:52.770
Nathen Harvey (he/him): Yeah, we are super interested in understanding the impacts of all the new things that are happening within the technology space. And you know, DORA, because it's been around for a while, was lucky enough to be around as a lot of organizations were moving out of data centers and into the cloud, and we got to witness that transition and research it along the way. And now we're really excited, as we feel like we're at the beginning of really incorporating AI into the full software development lifecycle.

47
00:13:52.770 --> 00:14:15.150
Nathen Harvey (he/him): We get to see signals of that in the research itself. In March of this year, which is 2024, sorry, 2025, I don't even know which year it is. In March of 2025, we published a report, the Impact of Generative AI in Software Development. This report leans on some of our findings from 2024 as well as some new research that we've conducted.

48
00:14:15.150 --> 00:14:27.470
Nathen Harvey (he/him): Essentially, we're trying to understand what the impact is of adopting AI across every aspect of your team, and we see some really strong signals there. We see signals that show that,

49
00:14:27.560 --> 00:14:52.310
Nathen Harvey (he/him): as you increase your AI adoption, there are a lot of good individual benefits. For example, as an individual, I might feel more productive in my job, I might spend more time in deep work or flow, and I have higher job satisfaction. And if we scope out a little further and look at some of the team-level outcomes as you increase your AI adoption, we're seeing things like documentation quality improve,

50
00:14:52.310 --> 00:15:14.740
Nathen Harvey (he/him): code quality improve, and technical debt actually go down. All of these are really strong signals. Like I said, we try to get a comprehensive view of how things interact. Unfortunately, one of the things that we also saw in our 2024 data was that as you increase your adoption and use of AI within your team,

51
00:15:14.950 --> 00:15:42.649
Nathen Harvey (he/him): software delivery performance actually falls off. Your throughput goes down a little bit, and the stability of those changes gets even worse. This means we're seeing more rollbacks of deployments, or more hotfixes that have to go out, as you increase your usage of AI. Now, it should be noted that that's data we collected in 2024. We're currently collecting data for 2025 to get

52
00:15:42.650 --> 00:16:02.300
Nathen Harvey (he/him): continuing updates on the actual impacts. And so I'll just throw in a quick note to listeners: dora.dev/survey. You can go and take our survey right now; it's open until July 18th. That's going to give us really good insight into how AI is impacting your team and your organization.

53
00:16:03.640 --> 00:16:20.179
Dewan Ahmed: We'll be sure to include the link in the video description or in the podcast description: dora.dev/survey. If you're listening before July 18th, 2025, please take the survey, because your feedback would be super important.

54
00:16:20.950 --> 00:16:47.719
Dewan Ahmed: One challenge for entrepreneurs or engineering leaders who are just starting a new company: let's say they have a company of size 10, and they're wondering when the right time is to dig into the findings of DORA and implement them. Is it too soon? Now they have a team of 500; is it too late? How have you seen different engineering teams adopt these best practices in their journey?

55
00:16:47.800 --> 00:17:10.020
Nathen Harvey (he/him): Yeah, that's a really good question, Dewan, and it really depends, of course. But I would encourage any team of any size to look at the DORA research and get an understanding of some of our findings. Unfortunately, a lot of teams find those four metrics, those software delivery performance metrics that we talked about, and they stop there.

56
00:17:10.020 --> 00:17:33.939
Nathen Harvey (he/him): Those software delivery performance metrics are nothing more than indicators, and they should be used by a team to help them understand: how are we doing today, and how does that compare to how we did six months ago, or how we'll be doing six months from now? But more important than those indicators or those metrics are the improvements that you're going to make. So what changes are you going to put into place? Now, if you're just getting started,

57
00:17:34.010 --> 00:17:58.989
Nathen Harvey (he/him): everything that you do is a change. So, like you said, you want to start off with some good foundational practices. DORA can give you an indication of what some of those good foundational practices are. And if you're a team of 500, you've grown; there's certainly an area for you to improve. DORA's tagline is "get better at getting better." We really want to help teams identify where they are

58
00:17:58.990 --> 00:18:07.849
Nathen Harvey (he/him): today so that they can make a decision about what to improve next. And then, like I said earlier, you take a small step, a small step, a small step.

59
00:18:07.850 --> 00:18:09.569
Nathen Harvey (he/him): Each one of these small steps

60
00:18:09.660 --> 00:18:33.149
Nathen Harvey (he/him): may move you forward. Of course, some of them may move you backwards. Either way, that's a success, as long as you take the lessons you learned from each of those iterations and bake them into what you're going to do next. Use that to help inform where you're going. And then, finally, I always like to remind teams that DORA goes out and researches as many teams as possible around the world.

61
00:18:33.370 --> 00:18:38.359
Nathen Harvey (he/him): And so our findings are really what's happening in the industry today.

62
00:18:38.570 --> 00:18:42.089
Nathen Harvey (he/him): But the reality is that, one, you don't work for the world;

63
00:18:42.100 --> 00:19:06.979
Nathen Harvey (he/him): you work for Harness, and even within Harness, you work on specific areas of Harness. So you have to take our broad-based findings and contextualize them for your team. I like to say that you should use DORA's findings as the hypothesis for the next experiment you want to run. As an example, DORA says that documentation quality really drives software delivery performance and other outcomes.

64
00:19:06.980 --> 00:19:23.569
Nathen Harvey (he/him): So now you can use that as a hypothesis: we believe that by improving our documentation quality, as measured by the particular characteristics that DORA helps us measure, we expect to see an improvement in our change lead time. We're going to be able to ship software faster.

65
00:19:23.570 --> 00:19:31.629
Nathen Harvey (he/him): Great! That's a wonderful hypothesis. Now you have an experiment that you've designed, and you can go run that experiment and see how it works.

66
00:19:33.150 --> 00:19:44.039
Dewan Ahmed: I can almost visualize the product leaders and all the practitioners who champion developer documentation quality giving a double thumbs-up to hear that.

67
00:19:44.470 --> 00:20:11.189
Nathen Harvey (he/him): Well, I would definitely refer you to the last four years or so of the DORA research reports, because we've done deep dives into documentation quality. And then, bringing it back to AI: as I said earlier, we see that as you adopt more AI, your documentation quality actually improves. This, I think, is really fascinating. There are some really interesting questions around this. First, how are people using AI,

68
00:20:11.190 --> 00:20:34.930
Nathen Harvey (he/him): especially when it comes to documentation? Well, I think there are two big ways. First, we might use AI to write documentation. That's good; as an engineer, I always hated writing documentation from a blank screen, but if AI generates something to help me get started, that's a good leg up, if you will. The other thing that AI is really good at, though, is summarizing existing documents.

69
00:20:35.010 --> 00:20:57.159
Nathen Harvey (he/him): Right? And so, as you're using more AI, you might actually be able to get better utility out of the existing documentation that you have, while you're also improving the quality of the documentation that you and your team have access to. So I think it's a really fascinating way to think about those intersections of how AI is helping across that software delivery lifecycle.

70
00:20:58.020 --> 00:21:27.079
Dewan Ahmed: I'll ask a question here that has been on my mind for a long time. We know the garbage-in, garbage-out thing: if you have bad API design, or let's say you just have a bad product experience, and you're trying to duct-tape that with AI and automation. Have you seen use cases or pain points around that, where something is fundamentally broken and people are trying to patch it up with automation and AI?

71
00:21:27.260 --> 00:21:52.634
Nathen Harvey (he/him): Yeah, I think it's really interesting, and certainly these are things that we have to watch out for. I can imagine AI, for example, writing documentation about a feature that doesn't exist in your API. Maybe it's looking at other APIs similar to the one it's currently documenting, and it takes some inferences from there and writes a new document for you.

72
00:21:53.030 --> 00:22:17.009
Nathen Harvey (he/him): That could be a real challenge. But I think this also points back to the idea that what we need to ensure is that we're getting fast, high-quality feedback on any changes we're making, whether that's a documentation change or a code change. We need good ways to assess that change and understand: is this the correct change to make, or is it not? How do we need to refine it to make sure that it's right?

73
00:22:18.540 --> 00:22:33.979
Dewan Ahmed: And when things do go wrong, one practice we all fall back on is incident response and SRE. How do you see AI and automation playing a role in how engineering teams do incident response?

74
00:22:34.460 --> 00:22:48.099
Nathen Harvey (he/him): Yeah, first and foremost, I don't think you can ever, certainly not today, and maybe "never" is too far, but certainly not in the foreseeable future, I do not believe that you can replace

75
00:22:48.100 --> 00:23:15.779
Nathen Harvey (he/him): with AI or with automation the intuition that a human has about how a system works. That intuition, of course, grows over time: as I work more and more with a system, I have better intuition about it, and as I have more experience dealing with incidents, outages, and so on, doing that incident response, I'm learning more and more about how that system works. So I don't think we should be aiming to replace

76
00:23:15.840 --> 00:23:24.939
Nathen Harvey (he/him): those SREs, those individuals that are doing incident response. But I do think, just like with software engineers, we can augment what they're doing

77
00:23:25.030 --> 00:23:43.420
Nathen Harvey (he/him): through AI. AI is very good at a number of different things. Just think of giving AI a log trace or a stack of logs and having it summarize them, having it dig through them. I've seen AI be pretty good at spotting anomalies in those logs. Now,

78
00:23:43.440 --> 00:24:05.420
Nathen Harvey (he/him): again, this is where some human intuition comes into play: it might spot an anomaly in a log that has nothing to do with the incident you're currently fighting, and you need that human to evaluate that and provide that feedback, either back to the machines or at least to the incident response. But I do think there are a lot of really fascinating ways that we can start

79
00:24:05.420 --> 00:24:22.140
Nathen Harvey (he/him): thinking about how we use AI to augment and assist the people doing incident response. And over time we, as a collective industry, are going to continue experimenting and learning, and hopefully bringing those lessons to light, so that we can all improve together.
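
One way to picture the log triage Nathen describes, stripped of any actual AI: cluster log lines into rough templates and surface the rare ones as anomaly candidates. This frequency heuristic is only a stand-in for what an LLM might do, and the masking regexes are assumptions; as he says, a human still judges whether a flagged line relates to the incident.

```python
import re
from collections import Counter

def template(line: str) -> str:
    """Mask the variable parts of a log line to get a rough message template."""
    line = re.sub(r"\d{4}-\d{2}-\d{2}[T ][\d:.]+", "<TS>", line)   # timestamps
    line = re.sub(r"\b(?:\d{1,3}\.){3}\d{1,3}\b", "<IP>", line)    # IPv4 addresses
    line = re.sub(r"\b[0-9a-f]{8,}\b", "<HEX>", line)              # ids and hashes
    line = re.sub(r"\b\d+\b", "<N>", line)                         # remaining numbers
    return line

def rare_lines(log_lines: list[str], max_count: int = 2) -> list[str]:
    """Return lines whose template occurs at most max_count times in the batch."""
    counts = Counter(template(ln) for ln in log_lines)
    return [ln for ln in log_lines if counts[template(ln)] <= max_count]
```

Rare is not the same as relevant, which is exactly the human-intuition gap Nathen points to.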

80
00:24:23.130 --> 00:24:43.919
Dewan Ahmed: And one thing engineers are trying to improve is how much code they can contribute, whether by their own choice or by being told that they have to produce more code. We see AI copilots everywhere, so we are entering an era of unimaginable productivity. Well, productivity with code.

81
00:24:44.140 --> 00:25:03.059
Dewan Ahmed: How do you see the challenge of managing this much code? Because the code is being generated, are we testing that code? From codegen to production, how do you see AI also limiting the challenges of too much code?

82
00:25:03.530 --> 00:25:19.939
Nathen Harvey (he/him): Yeah, I think you're onto something here, and this is certainly one of the hypotheses we have from the 2024 data. Why is software delivery performance falling off with the adoption of more AI? One of the reasons, potentially, one of our hypotheses, is that,

83
00:25:19.940 --> 00:25:42.779
Nathen Harvey (he/him): as an industry, we're very focused on, as you said, code generation with AI. But the reality is that once you've generated code, it still has to go through a number of steps before it gets into production. Those steps might include documentation, quality assurance, approvals, reviews, etc. And so, if all we use AI for is to generate more code,

84
00:25:42.780 --> 00:25:57.689
Nathen Harvey (he/him): we're actually just exposing more of those friction points between writing code and delivering code, and we feel the pain of them more acutely. So I think this is a reminder that we should be looking at

85
00:25:57.690 --> 00:26:21.659
Nathen Harvey (he/him): the entire system and understanding: where is the friction or the constraint? And then asking the question: how do we improve that constraint? Oh, and by the way, we have this new tool, AI, in our toolbox. Maybe it can help automate away or eliminate some of that friction. Let's figure out how to use AI in places other than just code generation. Now, that said,

86
00:26:21.950 --> 00:26:22.940
Nathen Harvey (he/him): Dewan,

87
00:26:23.150 --> 00:26:29.269
Nathen Harvey (he/him): I've been playing around with AI for code generation, and I'll tell you what, it is super fun.

88
00:26:29.270 --> 00:26:54.219
Nathen Harvey (he/him): It is so fun to use AI to build things that I might not have even tried to build before. And I can get them working, and they work well. I worry about the almost dopamine addiction that I get when I'm banging away with an AI agent: things just work, but not quite, but it's just one more thing and then it'll be ready. Oh, but just one more small thing,

89
00:26:54.220 --> 00:26:59.560
Nathen Harvey (he/him): and then it'll be ready, and I just keep going back and forth with the AI. Next thing I know it's 3 a.m., and I'm like, oh,

90
00:26:59.590 --> 00:27:03.430
Nathen Harvey (he/him): I didn't actually need to do this in the first place. I need to go to bed now.

91
00:27:03.430 --> 00:27:09.279
Dewan Ahmed: A very important question, Nathen: are you one of those people who thank their AI agents

92
00:27:09.600 --> 00:27:11.169
Dewan Ahmed: after every response?

93
00:27:11.330 --> 00:27:28.510
Nathen Harvey (he/him): Yeah, I wouldn't say that I thank my AI agent after every response. I do tend to be polite with it, saying please and certainly saying thank you here and there, maybe not every single time.

94
00:27:28.510 --> 00:27:43.559
Dewan Ahmed: Well, I don't know if it would appreciate it. Me being Canadian, I try to be overly thankful to it, and I appreciate the feedback; it also helps me. So, I love back end, DevOps, everything. But as soon as you throw some front end at me,

95
00:27:43.620 --> 00:28:00.449
Dewan Ahmed: I get scared. Front end has never been for me, until AI codegen. Now even I can spin up a nice-looking front end, thanks to the copilot. So we are entering a very strange time indeed.

96
00:28:00.660 --> 00:28:16.619
Nathen Harvey (he/him): Yeah, it is fascinating how much work you can get done so quickly with AI. The concern I always have that goes along with that is: you're not a front-end developer, and it just generated some front end that looks really good. Is it maintainable?

97
00:28:16.830 --> 00:28:41.510
Nathen Harvey (he/him): Would someone who is an expert in front-end development look at that and be happy? And would they be able to change it over time? These are legitimate concerns. I think we tend to almost discount the value or the skill that other people have, especially when AI can come in over the top and essentially do that same work. But you know the work that you do:

98
00:28:41.510 --> 00:28:55.239
Nathen Harvey (he/him): what AI does is not going to replace you, and so you shouldn't expect it to replace those other roles either. We still need that expertise. We still need that collaboration across both the humans and the AI agents.

99
00:28:55.370 --> 00:29:21.860
Dewan Ahmed: Absolutely. AI is taking a snapshot of the decades of experience that humans have accumulated and giving us some sort of template to work with. But that is not a template to be deployed to production; it needs to be reviewed by real humans, with all the privacy concerns and the policies and enforcement, and only with a proper feedback loop can it be used.

100
00:29:21.860 --> 00:29:22.590
Nathen Harvey (he/him): Yes.

101
00:29:22.970 --> 00:29:44.219
Dewan Ahmed: Now, we're talking about AI a lot, and if you're listening, one of the reasons is that our theme for Season 4 is AI meets software delivery. And, Nathen, if you think of DORA for AI, how do those metrics change, and what are some of the key points from that report?

102
00:29:44.580 --> 00:30:09.219
Nathen Harvey (he/him): Yeah, for sure. First, I don't think that we'll change how we measure software delivery performance. Those four key measures of software delivery performance are still going to be the same. We want to understand how much change we can move through the system, and how safely and reliably we can move that change through the system. That still matters. Where I think we're going to see AI playing a bigger role, and bringing in potentially new metrics,

103
00:30:09.240 --> 00:30:38.120
Nathen Harvey (he/him): is more on the front end of that: how quickly am I able to generate code? Maybe looking at the number of pull requests we have per engineer, or something like this. Maybe this is going to increase; maybe the size of those pull requests is going to change. All of this really matters. But also, as we've talked about, what are the other stages and steps that a change has to go through before it lands in production, and how are we measuring those things? So I think that

104
00:30:38.240 --> 00:31:02.869
Nathen Harvey (he/him): understanding that AI provides us additional assistance and additional capabilities across those stages is really interesting, and we can use some of the existing measures that we have to understand how things are going. That said, AI is also fundamentally changing a number of different things. One of the things that we've looked into, as an example: how do you feel about code ownership?

105
00:31:02.910 --> 00:31:24.600
Nathen Harvey (he/him): Right? If you wrote the code together with an AI, whose code is that? Is it your code? Is it Gemini's code? Is it shared code? It's really interesting to start thinking about that. And that's another thing that we're researching this year: what are the human impacts of generating and using more AI throughout that software development lifecycle?

106
00:31:24.600 --> 00:31:35.230
Nathen Harvey (he/him): So I think we are continually finding new metrics and new areas to consider, things to look at to understand what's driving that.

107
00:31:35.280 --> 00:31:51.059
Nathen Harvey (he/him): And I'm really happy that, together with Harness, I've been working on a book, an ebook that's all about measuring the impact of AI development tools. That's coming soon. I won't tell you an actual date, but know that it's coming soon, and I would definitely be on the lookout for it.

108
00:31:51.790 --> 00:32:20.490
Dewan Ahmed: So, listeners, please keep an eye out for the book. Engineering leaders, whenever they listen to these podcasts, try to think: let me take one action item, let me think of one key aspect when I try to blend AI with software delivery. I'm pretty sure you'll have in-depth analysis in that upcoming book, but if you want to share one thing for engineering leaders to keep an eye out for when they want to

109
00:32:20.610 --> 00:32:24.240
Dewan Ahmed: integrate AI in software delivery, what would that be?

110
00:32:24.320 --> 00:32:32.540
Nathen Harvey (he/him): I think, for me, the one place to integrate AI into software delivery is where your team feels the most pain. And so if you're a leader,

111
00:32:32.540 --> 00:32:57.409
Nathen Harvey (he/him): the first thing I would recommend you do is go ask your team: where is the pain? Where's the friction? What's preventing you from delivering more value? It might be that they say we can't generate code fast enough. It might also be that we can't get feedback fast enough, or that the change approval process just takes too long; it's too heavyweight. These are all indicators of where you might consider: how do we bring

112
00:32:57.410 --> 00:33:15.000
Nathen Harvey (he/him): AI in to help alleviate some of this pain? So, as a leader, I would basically give you two pieces of advice. One, go ask your team: where does it hurt? And if they can't articulate that very well, the next step I would recommend is to get together the cross-functional team

113
00:33:15.000 --> 00:33:37.660
Nathen Harvey (he/him): that's responsible for the application you're leading and do a value stream map: map out every step and handoff that a change has to go through, from being committed to the version control system all the way through to production. When you bring together a cross-functional team and build that map together, you are almost guaranteed

114
00:33:37.910 --> 00:33:52.299
Nathen Harvey (he/him): to find a number of duplicative tasks, points of friction, and handoffs that are unnecessary or that are introducing waste into the system. So as a leader, help your team identify that waste, and then work with them to go eliminate it.
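
For teams that want to put numbers on that map, a value stream can be modeled as steps with active work time and wait time; flow efficiency and the single biggest wait usually point straight at the constraint. The steps and hours below are hypothetical, purely to show the shape of the exercise.

```python
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    work_hours: float   # hands-on time spent on the change
    wait_hours: float   # time the change sits in a queue or awaits a handoff

# Hypothetical value stream for a change going from commit to production.
stream = [
    Step("code review",           work_hours=1.0, wait_hours=18.0),
    Step("CI build and tests",    work_hours=0.5, wait_hours=0.2),
    Step("change approval board", work_hours=0.5, wait_hours=72.0),
    Step("deploy to production",  work_hours=0.3, wait_hours=4.0),
]

total = sum(s.work_hours + s.wait_hours for s in stream)
work = sum(s.work_hours for s in stream)
bottleneck = max(stream, key=lambda s: s.wait_hours)

print(f"End-to-end lead time: {total:.1f}h, only {work:.1f}h of it active work")
print(f"Flow efficiency: {work / total:.0%}")
print(f"Biggest wait: {bottleneck.name} ({bottleneck.wait_hours:.0f}h idle)")
```

In this made-up stream the approval board dominates the lead time, which echoes the "change approval process just takes too long" answer Nathen anticipates from teams.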

115
00:33:52.670 --> 00:34:09.719
Dewan Ahmed: There you go: you start from the pain points, and you go from there. I have a friend, Captain Canary. Again, apologies to the listeners who can't see it, but I have, some call it a chicken, some call it a duck. It's not a chicken, it's not a duck, it's a canary bird,

116
00:34:09.770 --> 00:34:23.189
Dewan Ahmed: the mascot of Harness, and Captain Canary has a question for Nathen Harvey. Captain Canary wants to know future predictions: where do you see AI in software delivery in three years?

117
00:34:23.289 --> 00:34:51.129
Nathen Harvey (he/him): It's a really great question. Thank you, Captain Canary. DORA is highly tuned for understanding where we are today, which makes it more difficult to make predictions. So this maybe isn't a DORA prediction as much as it is a Nathen prediction. But I'd say that in three years' time we're going to look back at where we are today, and

118
00:34:51.129 --> 00:35:19.119
Nathen Harvey (he/him): we might look back with fondness: we were at this moment where we had all of this new tech coming into the space, but here we are, three years in the future, and we now have a much better understanding of how best to utilize it across the software development lifecycle. The reality is that AI is here to stay. We've seen this in our research; we live it every single day, all of us do, as you look around at what's happening. AI is here to stay.

119
00:35:19.556 --> 00:35:26.339
Nathen Harvey (he/him): And I hope that in three years' time we will have really grasped

120
00:35:26.519 --> 00:35:54.179
Nathen Harvey (he/him): the best ways to use AI. Because the reality is that today each and every one of us is getting better at interacting with LLMs and the AI tools that are available, and at the same time, each and every day, the models are getting stronger and the capabilities are getting better. So I think we're going to see a real change in how we work, and that change is going to be led by assistance from AI.

121
00:35:55.700 --> 00:36:01.060
Dewan Ahmed: And one thing that would shape the change is feedback from practitioners.

122
00:36:01.060 --> 00:36:01.520
Nathen Harvey (he/him): Absolutely.

123
00:36:01.520 --> 00:36:11.940
Dewan Ahmed: To get your feedback in, the 2025 DORA survey, the annual DORA survey, is still open. So, Nathen, one more time, could you please tell our listeners about the survey?

124
00:36:11.940 --> 00:36:18.359
Nathen Harvey (he/him): Yeah, for sure. So this survey is how DORA collects a bunch of our data and insights. But

125
00:36:18.490 --> 00:36:42.499
Nathen Harvey (he/him): I'd love for you to take the survey so that we have more data and insights, but I'd rather you take the survey to help yourself. I honestly believe that when you sit down and take this survey and think about the questions we're asking, it gives you that moment of reflection. It helps you and your team identify those areas of pain and those areas of potential improvement. And so I actually recommend that, as a team,

126
00:36:42.500 --> 00:37:07.290
Nathen Harvey (he/him): each individual takes the survey, and then maybe you schedule a lunch and learn, or a discussion, a retrospective, that talks about the survey. And actually, if you go to dora.dev/survey, that's where you can take the survey up until July 18th, but on that page you'll also find a link to a survey discussion guide. That gives you some facilitation ideas for hosting a meeting with your team to discuss

127
00:37:07.380 --> 00:37:21.349
Nathen Harvey (he/him): what things popped, what things caught your interest in the survey, what changes you want to make tomorrow on your team. So I recommend taking the survey and then getting together with your team and working through that survey discussion guide.

128
00:37:22.600 --> 00:37:34.310
Dewan Ahmed: On the note of getting together with your teams: one place you can get together with your team is DevOpsDays. DevOpsDays is a series of events held throughout the globe, and you can

129
00:37:34.370 --> 00:37:54.260
Dewan Ahmed: network with practitioners and learn about different topics in the space. DevOpsDays Dallas is happening September 17th and 18th, and this year Harness is a participating sponsor. So drop by our booth, we'll be raffling off a Nintendo Switch, and you can use the code SHIPTALK to take 20% off

130
00:37:54.260 --> 00:38:10.709
Dewan Ahmed: the regular ticket price. In this first episode of Season 4 of the ShipTalk podcast, we talked about AI meeting software delivery with DORA lead and developer advocate at Google Cloud, Nathen Harvey. Nathen shared his experience

131
00:38:10.710 --> 00:38:32.539
Dewan Ahmed: from working with open-source communities, the key indicators from DORA, and some of the things to look out for when you're trying to integrate AI in your software delivery. Thank you so much, Nathen. Where can listeners find you online if they want to learn more about your work and DORA?

132
00:38:32.720 --> 00:38:57.569
Nathen Harvey (he/him): Sure. Probably the best two places to find me are either LinkedIn or Bluesky. And just note, my first name is spelled a bit uniquely: it's N-A-T-H-E-N, and then my last name, Harvey, H-A-R-V-E-Y. You can find me with that name on both of those platforms. And Dewan, thank you so much for having me. I also want to give a really quick shout-out to Harness, because

133
00:38:57.740 --> 00:39:07.940
Nathen Harvey (he/him): I've been partnering with Harness for a long time now, and they're a gold sponsor for this year's DORA research, and I really appreciate that. And, boy, it's so fun working together with you and your team.

134
00:39:08.360 --> 00:39:28.950
Dewan Ahmed: Much appreciated for the kind words. Looking forward to bumping into you at some of the events. And listeners, please take the DORA survey and let us know your feedback. Until then: my name is Dewan Ahmed, you can find me on dewanahmed.com and LinkedIn, and we are ShipTalk, where we talk about the ins and outs, ups and downs of software delivery. Thank you, Nathen.

135
00:39:29.260 --> 00:39:30.530
Nathen Harvey (he/him): Thank you so much.

