ShipTalk - SRE, DevOps, Platform Engineering, Software Delivery

Debugging Developer Productivity in an AI-native World with Aravind Putrevu (CodeRabbit)

By Harness, Season 4, Episode 2

In this episode of ShipTalk, Chinmay Gaikwad sits down with Aravind Putrevu, Head of Developer Relations at CodeRabbit, and dives into the evolving landscape of AI in software development. Aravind shares insights on the influence of AI tools across the software development lifecycle, from coding and code reviews to automation and security. Aravind highlights the pivotal role of context engineering, the continued importance of human oversight in AI-driven workflows, and how teams can harness diverse AI tools for productivity. The discussion also explores Kubernetes' impact on AI workloads, best practices for developer enablement, and the latest trends in developer communities and learning resources.

WEBVTT

1
00:00:03.240 --> 00:00:05.420
Chinmay Gaikwad: Welcome everyone to another episode of ShipTalk! I’m your host today — Chinmay Gaikwad, Director of Product Marketing at Harness. On this show, we dive into everything related to SRE, DevOps, platform engineering, and software delivery.

3
00:00:21.740 --> 00:00:26.350
Chinmay Gaikwad: I’m really excited about today’s guest — someone who’s deeply involved in this space.

4
00:00:26.620 --> 00:00:38.020
Chinmay Gaikwad: Please welcome Aravind Putrevu, Head of Developer Relations at CodeRabbit, an AI-powered code review platform. Aravind, great to have you on the show!

5
00:00:39.090 --> 00:00:57.729
Aravind Putrevu: Thanks, Chinmay — and thanks to the Harness team. I'm super excited to be on ShipTalk. I’ve actually listened to a few episodes before, including the one with your VP of Engineering. That was a great conversation. Glad to be here and excited to talk.

6
00:00:58.250 --> 00:01:26.820
Chinmay Gaikwad: Awesome! So, to kick things off — Aravind is currently leading developer relations at CodeRabbit. He’s been working in developer-facing roles for a long time, and I’d love to start by learning more about your journey. How did you first get into tech? What sparked your interest? And how did that path eventually lead you into developer relations?

7
00:01:28.100 --> 00:01:54.720
Aravind Putrevu: Yeah, happy to share. So like you, I’m originally from India — and as you know, especially in South India, many students choose STEM after high school. My first exposure to computers was through my cousin, who had an assembled PC. He used to do some basic programming, and I got curious.

8
00:01:54.720 --> 00:02:20.320
Aravind Putrevu: I started experimenting with basic languages like Logo and Turtle — this was before I even hit my teens. That curiosity stuck with me. I didn’t formally study computer science at first — I was placed in mechanical engineering — but I eventually switched and earned my CS degree.

9
00:02:20.320 --> 00:02:45.309
Aravind Putrevu: I joined Infosys after graduation, focusing on data applications, especially in banking and financial services. I worked with major Fortune 10 clients, often in large-scale production environments. A big part of my role involved debugging — not just ops, but serious root-cause analysis across large, complex systems.

10
00:02:45.400 --> 00:02:55.249
Aravind Putrevu: I wasn’t just maintaining systems — I was digging deep into how and why things broke. I worked across a range of data tools: Informatica, IBM MQ, DB2, and a whole stack of IBM-related software. That's when I realized just how important data is,

12
00:03:19.390 --> 00:03:49.160
Aravind Putrevu: and from there I grew an interest in data security. Then McAfee happened. I worked at McAfee for nearly three years, and it was a great learning experience. I'm not just saying that for the sake of it: I learned a lot about code quality as an actual engineer on a full-fledged product, working with platform engineering teams. We built a platform that others consumed.

13
00:03:49.370 --> 00:03:58.870
Aravind Putrevu: That's where I learned more about endpoint detection and a broader range of security-related aspects of

14
00:03:58.940 --> 00:04:17.100
Aravind Putrevu: not just coding, but also data, operating systems, how people use technology, and large customers. We grew from a six-to-eight-member team to a 30-to-40-person team, and we did crazy numbers.

15
00:04:17.100 --> 00:04:37.910
Aravind Putrevu: It was a single product line for McAfee that grew really well. It's called MOVE, which is a very interesting name, but essentially it's data center anti-malware software. That's where I encountered Elasticsearch, because I was trying to PoC it.

16
00:04:38.180 --> 00:05:04.760
Aravind Putrevu: And I couldn't stay idle. During the weekends I used to go to meetups and run meetups for various developer communities, Google Developer Groups, etc., in Bangalore. Then I stumbled across an opportunity, reached out to a colleague, and a developer relations role happened. I always wanted to be an educator, someone who could share what I'm learning more than anything else. I think that pulled me into

17
00:05:05.000 --> 00:05:33.089
Aravind Putrevu: this particular area, and I never looked back, because I was one of the first to do go-to-market and developer relations for Elastic in India, then eventually Southeast Asia, and I also did some work in the ANZ region, and then all across. I eventually moved to Western Europe, which is where I live now, in Amsterdam. But I work with a lot of folks in the Pacific and Eastern time zones in the US.

18
00:05:33.920 --> 00:05:40.300
Chinmay Gaikwad: Got it. So Elastic was the first company where you worked as a developer relations person,

19
00:05:40.420 --> 00:06:04.159
Chinmay Gaikwad: and that basically gave you an insight into what developer relations is. So for our listeners who want to pivot into developer relations: what was your expectation before you went into it, when the first opportunity came by? And now that you've grown into the role and contributed a lot, has the expectation matched the reality?

20
00:06:05.440 --> 00:06:26.709
Aravind Putrevu: Well, to answer the question first: expectation did not match reality. I always thought it was mostly education, but there's more to it, which I'm happy to learn. And to be honest, while you mentioned DevRel today, that's one piece of the entire puzzle I look after at CodeRabbit. There are more things I look at, like the entire developer marketing function

21
00:06:26.710 --> 00:06:50.830
Aravind Putrevu: and a lot of developer-related go-to-market: nurture campaigns and things like that. That involves quite a bit of technical product understanding. So it's a mashed-up role of several things; you can think of it as growth marketing or something like that. But when I was pivoting into this area, my major inspirations... I don't even know whether

22
00:06:50.830 --> 00:06:58.420
Aravind Putrevu: a lot. Many people would know these names, but then they were really crazy, like, there's there's Reto Meir from Android community.

23
00:06:58.420 --> 00:07:23.039
Aravind Putrevu: who's been a great teacher in educating people about Android. And there are several other folks in the Android and Kotlin communities who, back in the day, were really instrumental in the way DevRel has been done, especially at Google, folks like Amrit Sanjeev, who's also been a DevRel person for Google Cloud. There are multiple people,

24
00:07:23.250 --> 00:07:39.769
Aravind Putrevu: and quite a few from other ecosystems as well; great inspirations like Scott Hanselman. Some of them are now VPs of DevRel, at a different place in their careers, but I look up to them. It was really great, I mean, amazing.

25
00:07:40.500 --> 00:07:56.830
Aravind Putrevu: But once I was in DevRel, I also realized that it differs for each company. It depends on the current business goals and the way they want to structure the role, because it can be a very generalist role.

26
00:07:56.830 --> 00:08:13.810
Aravind Putrevu: It doesn't need to be like while you wear many hats. Sometimes it's more like a product manager. Sometimes it's more like a support person. Sometimes it's just like solution engineering. Sometimes it's just like technical writer, right where you own all the technology part of it. Very rarely. You would do like

27
00:08:13.810 --> 00:08:27.009
Aravind Putrevu: maybe positioning and messaging, but I got to do that as well. So I'm super happy with what I learned in short time, but then, also, like it is a risky place. If you want to stabilize and stay forever and do a lot of things, I think

28
00:08:27.390 --> 00:08:29.520
Aravind Putrevu: it's not the place.

29
00:08:30.480 --> 00:08:31.210
Chinmay Gaikwad: Got it.

30
00:08:31.330 --> 00:08:52.209
Chinmay Gaikwad: Okay, that's interesting to know that you cover such a broad range of responsibilities. Having said that, since you got into developer relations as somebody who loves teaching and mentoring, how much of your day-to-day is actually teaching and mentoring? And who is that targeted to? Who is your audience for that?

31
00:08:53.720 --> 00:09:07.619
Aravind Putrevu: If you're asking about today, I think it's a bit less, because I'm not doing a lot of talks or going to conferences and meetups, etcetera. But back in the day, I used to do at least

32
00:09:07.780 --> 00:09:23.580
Aravind Putrevu: a talk, or to a virtual talk on on a webinar or a place like Zoom, or speaking to a community, and then definitely, one talk by traveling elsewhere to meet developers at a conference at a large meetup in a different city, etc.

33
00:09:23.580 --> 00:09:40.749
Aravind Putrevu: So that's how I used to be. But then it has reduced, as I grew in the ranks, or like, you know. But the way that I look at teaching is more mentorship at this point of time I still work with a lot of communities. Developer community leads. And people who ask me more questions about

34
00:09:41.160 --> 00:10:09.559
Aravind Putrevu: how things were placed like questions around like, How do I pivot into this. How do I continue to grow in this area like is Kubernetes the thing right to continue work on so so more things like that internally as well. There's a team that I work with, and then there's a lot of coaching. There's a lot of mentorship guiding people through the task. So that's what keeps me active and driving impact.

35
00:10:10.600 --> 00:10:15.019
Chinmay Gaikwad: Got it, that makes sense. You mentioned Kubernetes in there,

36
00:10:15.260 --> 00:10:19.540
Chinmay Gaikwad: and I think we met at one of the KubeCons previously. So

37
00:10:19.960 --> 00:10:34.480
Chinmay Gaikwad: I'm interested to know how KubeCon, which is one of the biggest conferences for Kubernetes developers, has changed over the past years, especially recently with the advent of Gen AI.

38
00:10:35.820 --> 00:10:49.110
Aravind Putrevu: Oh yeah, there's a lot we can talk about. Kubernetes is like the Linux of this entire ecosystem, right? I don't know how many people know or understand this, but at this point a lot of these inference workloads,

39
00:10:49.160 --> 00:11:04.599
Aravind Putrevu: like, say, running, connecting to a Gpu, and then, you know, making sure that your Llm. Is deployed, and then, you know you, you kind of like, manage the to deliver the tokens on an Api is still done through a scalable layer orchestration layer like Kubernetes

40
00:11:04.600 --> 00:11:21.409
Aravind Putrevu: Kubernetes is still the core piece of entire thing, and from an infrastructure point of view it's it's the most important piece. And then over the years I see. Like, yeah, the Kubernetes community and everyone has taken a look at it. Linux Foundation itself acknowledges the entire

41
00:11:21.410 --> 00:11:35.830
Aravind Putrevu: fanfare around the entire Gen. AI. Space. And I believe that the community still has a lot of work to do, like in the way that how the entire, because the ecosystem is is earlier like was very much into

42
00:11:36.370 --> 00:11:58.889
Aravind Putrevu: solving for scale debugging issues. And, like, you know, like solving for observability, running more workloads that are not there yet, like, say, maybe, how do we build for telcos? How do we build for different sort of like oil and energy and gas? So that sort of initiatives. But I think again, there's more new energy into how

43
00:11:59.120 --> 00:12:01.149
Aravind Putrevu: Kubernetes ecosystem is placed

44
00:12:01.260 --> 00:12:10.199
Aravind Putrevu: with with the new Gen. AI thing. So I I believe there's more work to be done. But then I think I also see, like in the last couple of keynotes, that it has changed

45
00:12:10.630 --> 00:12:14.450
Aravind Putrevu: to a certain extent. And then people are already opening in into the all of this. Yeah.

46
00:12:15.110 --> 00:12:15.690
Chinmay Gaikwad: Got it.

47
00:12:15.690 --> 00:12:16.240
Aravind Putrevu: Okay. Cool.

48
00:12:16.610 --> 00:12:40.850
Chinmay Gaikwad: And specifically in terms of developing applications on Kubernetes or non-Kubernetes layers, we've seen a lot change over the last couple of years, especially since ChatGPT was introduced. How has your approach to application development, or to talking to developers about application development,

49
00:12:40.980 --> 00:12:45.039
Chinmay Gaikwad: changed over the last few years compared to the years prior?

50
00:12:46.220 --> 00:12:56.220
Aravind Putrevu: There are still folks who are wary of how and where all of this is going from a generative AI perspective.

51
00:12:56.320 --> 00:13:24.529
Aravind Putrevu: As someone who has come up working in the very old-school, on-prem, no-cloud era, then private cloud and public cloud, then the fully cloud, PLG side of things, and now eventually the Gen AI side, I can definitely say there is still wariness. But I've also noticed that a large part of the folks building apps or adopting Gen AI today are enterprises.

52
00:13:24.770 --> 00:13:42.469
Aravind Putrevu: Not many people realize the productivity gains, and then the way that they also build this entire workflow is like, probably starting with a simple Gpt Poc, like, you know, on on Openai, or or like, you know, using a simple rag

53
00:13:42.470 --> 00:13:54.549
Aravind Putrevu: by working with this design studio, or, like, you know, someone who has been a consultant and working with these apps. But then very, very untraditional. If you look at like

54
00:13:54.550 --> 00:14:12.580
Aravind Putrevu: predominantly. People have built Ml center of excellence, have worked building like large data centric platforms, unifying data building data, observability. And a lot of those initiatives, I think, while that's continuing, the space here in the Gen. AI is moving. Really fast developers

55
00:14:12.600 --> 00:14:27.769
Aravind Putrevu: have been using, you know, and and learning from market pretty quickly, and the market also is moving really fast. So so what's a popular library last week might not be a popular library this week. So

56
00:14:27.900 --> 00:14:31.909
Aravind Putrevu: everyone understands this. They understand, digest and move on.

57
00:14:32.120 --> 00:14:49.200
Aravind Putrevu: What might be a problem today might not be a problem in August, in the coming months, right? So it's dynamically and rapidly changing. And developers are adopting all these techniques

58
00:14:49.320 --> 00:15:03.119
Aravind Putrevu: through by learning through, like the either original source, like the social media handles of brands or something, are probably attached to a specific community. Yeah, on the social media, or like anywhere else. So that's how I see.

59
00:15:04.150 --> 00:15:08.055
Chinmay Gaikwad: Got it. And then just to double click on that

60
00:15:09.510 --> 00:15:13.810
Chinmay Gaikwad: from a developer advocate, slash developer marketing perspective.

61
00:15:14.523 --> 00:15:21.849
Chinmay Gaikwad: How has your approach changed from, let's say, when you were at Elastic versus where you are now at CodeRabbit?

62
00:15:23.830 --> 00:15:31.819
Aravind Putrevu: Are you asking more about how developers are learning this entire area, or...

63
00:15:31.820 --> 00:15:44.162
Chinmay Gaikwad: Learning this entire area, and how current developer advocates are helping developers accelerate their learning and get to good-quality code, whatever that quality is.

64
00:15:45.040 --> 00:15:45.530
Chinmay Gaikwad: yeah.

65
00:15:45.530 --> 00:15:46.170
Aravind Putrevu: Yep.

66
00:15:46.270 --> 00:16:12.289
Aravind Putrevu: One thing I can definitely mention: developers and enterprises have realized that AI coding has been really productive. You don't need to take a hop to go search Google, find a Stack Overflow question, grab the snippet, try it out, and then go back. It's all happening in your IDE or your central CI/CD pipeline.

67
00:16:12.510 --> 00:16:14.699
Aravind Putrevu: It's pretty simple and natural.

68
00:16:15.110 --> 00:16:38.970
Aravind Putrevu: but at the same point of time they're also very about like, you know, enterprises, at least, are very about like shipping slop. So so the way that a lot of like teams were trying to look at like, say, the the evangelists, or even companies they're trying to educate folks through meaningful engineering articles or how we built it. Why, we built it. What is the problem?

69
00:16:39.356 --> 00:16:57.909
Aravind Putrevu: That's been really going more. Say, back in the day. Probably there's a meetup like more in person, centric development, or like probably a large workshop or a webinar. Right? These are like the way that you get to know about a product and all. Now it's probably

70
00:16:58.270 --> 00:17:02.919
Aravind Putrevu: 3 min video which most of it can be explained in 1st 30 seconds

71
00:17:02.980 --> 00:17:31.289
Aravind Putrevu: the attention span is less. But also the way that people are being taught is through through either like the developer relations, teams, communities, or like the brands, like the companies itself is very different, and video continues to be a very big area. For everyone podcast continues to be like a great place to learn new things, interact with new things and understand like people whom even don't have any connection about. So

72
00:17:31.450 --> 00:17:58.620
Aravind Putrevu: so I think the mode or the method has been like has changed a lot while the core source remains like, you know where you need to give authentic content, share your opinion, share how you built the solution, why is it needed, and also tell like known issues and then provide support through like community forums like discord, or, like you have a discourse publicly that you would leverage, people would leverage everything. Gone are the day where

73
00:17:58.620 --> 00:18:10.410
Aravind Putrevu: this long pass of doing and planning for an event, planning for a webinar, asking people to come, join and zoom like if you have something, share it already. So it's pretty pretty quick.

74
00:18:10.830 --> 00:18:12.937
Chinmay Gaikwad: Pretty instant. Okay, that makes sense.

75
00:18:14.850 --> 00:18:36.169
Chinmay Gaikwad: And we talked about a lot of developers using Gen AI tools to write code. Vibe coding has been the talk of the town recently. What's your take on vibe coding? Do you see it being adopted a lot in production environments, or do you think there is a lot of skepticism around it as well?

76
00:18:36.610 --> 00:18:46.249
Aravind Putrevu: First of all, there is a lot of skepticism, I'll accept that, and a lot of senior developers don't like the term vibe coding. But the way it has been

77
00:18:46.710 --> 00:19:13.230
Aravind Putrevu: positioned I would not say messaged as well. But then positioned is a bit wrong. Why, I would say, is, if you look at the entire thing the way that I think Andre or someone did it in a tweet. And then it went viral because everyone was wipe coding where you press tab, tab, and then, you know, press, click, click, click, and then there you go. The the solution, or, like the feature, is fully built with an AI coding assistant. Well, it's great. But then

78
00:19:13.230 --> 00:19:32.729
Aravind Putrevu: the work doesn't end. Software engineering doesn't end with that right? Like, forget about docs. Or you need tests that you need to write for the code that is generated. You basically want to make sure that code runs. And then, if you are even a small shop you want to test in different browsers, and it explodes right? You need to ship it to a common place and a lot of that.

79
00:19:32.900 --> 00:19:59.529
Aravind Putrevu: So vibe coding as a term is not like, maybe very much applicable to like the the way that it the entire area. It's a very small piece of place where probably non-tech users could quickly iterate on a prototype without waiting for, like, you know, a real engineer to come and tell like, say, you could use tools like lovable or vnot

80
00:19:59.850 --> 00:20:17.410
Aravind Putrevu: to just do a quick iteration on whatever is there on the site. But still you need something more to do it, so I would not even say that there are like a lot of like wrong way of understanding

81
00:20:17.540 --> 00:20:19.969
Aravind Putrevu: the the term called wipe coding.

82
00:20:20.180 --> 00:20:24.349
Aravind Putrevu: But then, in general there is a bit of net positive around it.

83
00:20:24.530 --> 00:20:40.060
Aravind Putrevu: and and then I think there's more labor that we all could leverage more. And then at least, it's been like, we are inviting more developer population into this. We're making more people developers by using the term like wording, people would understand more.

84
00:20:41.190 --> 00:20:50.240
Chinmay Gaikwad: Yeah, totally. And you said that a lot of people come in, become developers, write code.

85
00:20:50.550 --> 00:20:58.620
Chinmay Gaikwad: And the next step is essentially code review. You've been working in code reviews for a long time, and CodeRabbit works in code reviews.

86
00:20:58.940 --> 00:21:08.850
Chinmay Gaikwad: How has code review changed from the traditional pre-vibe-coding era to the current vibe coding era, or even just with Gen AI tools?

87
00:21:10.010 --> 00:21:35.719
Aravind Putrevu: There's an analogy from the security area. There's something called SIEM, security information and event management. A SIEM is a very popular kind of solution where all the security events come in, and then you query to look at signals and see if there are any security incidents or things like that. The SecOps analyst usually puts in a query, goes out for a coffee, comes back, and the result is probably ready. That's how the

88
00:21:35.720 --> 00:21:41.039
Aravind Putrevu: eventually security ecosystem used to look at seams.

89
00:21:41.060 --> 00:22:08.819
Aravind Putrevu: And after a while, like when Cloud and all these New Age query engines have come in open source like elasticsearch Fristo, and a lot of these right, and it has accelerated quite a bit. And then you don't need to like, go back to get a coffee. The results are available instantly. Apply the same analogy to the current area. Right? Code review has been, I could say it's a bottleneck, and it's like, probably cumbersome. And then

90
00:22:08.840 --> 00:22:34.150
Aravind Putrevu: in some cases it could involve some sort of like, you know, animosity between the developers, because developers are naturally introverted, and then they might not talk as much as like, you know anyone else, let alone someone like me. Right? Maybe I'm talking more. But then you're forcing developers through the process called code review to talk more and collaborate and then exchange ideas. And this is a place for friction.

91
00:22:34.360 --> 00:22:43.709
Aravind Putrevu: And then I think what AI has done apart from the real like, you know, the code quality. Okay, we automate all these linters. We do a lot of that. Apart from that.

92
00:22:43.800 --> 00:22:56.669
Aravind Putrevu: it is not able to. It is not just accelerated. A process of giving that review review very quickly after the change has been made, but also brought in a neutral perspective into this entire code review process.

93
00:22:57.088 --> 00:23:11.629
Aravind Putrevu: No tool, I would. I can definitely say, because as a developer, I have used so many tools in this specific area. No tool has has solved this issue so easily than generative AI plus tools like code rabbit, I think, like

94
00:23:11.700 --> 00:23:36.460
Aravind Putrevu: no tool has solved. Because, like, I've used a lot of code quality tools and code security tools. And they used to do half and half like because they solve a very specific puzzle in this entire area. And then you need to bring, like, you know, say, a dast tool for doing like a Sas tool like you need to bring into all of these, and then it also creates huge amount of tech debt that we kind of like.

95
00:23:36.880 --> 00:23:41.010
Aravind Putrevu: Put it away for some other sprint or something like that. So

96
00:23:41.180 --> 00:24:03.470
Aravind Putrevu: I would say that with the advent of AI, and then with tools like Code Rabbit, you're not accelerating only on the time you're accelerating on producing less bugs. You're accelerating also on making sure the teams are productive and they share useful discussions. And they're focused on the core area. They're not talking about like nitpicks. Style issues

97
00:24:03.500 --> 00:24:26.369
Aravind Putrevu: that your team are that's differently mandated and enforced. It's much better. And then you can focus on whether the concurrency system is perfectly designed for a thousand user load. Right? So things like that. So you can focus on the core business and the need than all these styles and nitpicks, and all

98
00:24:26.470 --> 00:24:35.259
Aravind Putrevu: that being said, I think Code Rabbit also does these find these bugs. And then you know all of these refactor issues. So it's added value in general.

99
00:24:36.080 --> 00:24:39.251
Chinmay Gaikwad: Got it. That's very interesting, because

100
00:24:40.030 --> 00:24:51.890
Chinmay Gaikwad: trust is a a big factor. Whenever I've talked to like developers and companies, trust in Gen. AI tools is a big factor in their decision making.

101
00:24:52.150 --> 00:25:03.990
Chinmay Gaikwad: So with reviews, how much trust can developers actually put in AI-powered reviews? And is human-in-the-loop still a mandatory thing, or is it optional?

102
00:25:04.880 --> 00:25:15.130
Aravind Putrevu: At CodeRabbit we continue to say that we augment the process; we don't remove the human in the loop, first of all, and we don't push for that. A lot of people

103
00:25:15.410 --> 00:25:38.289
Aravind Putrevu: ask us like, you know, if Code Rabbit approves, just push the Pr. Just merge it right, make it autonomous. So we don't have that option in the in the Pr process. It's not just like we are weary of doing that because a lot many people continue to say that Code Rabbit is really great, and you know this has been a cruel tool. But then we always want a human in the loop.

104
00:25:38.340 --> 00:26:02.710
Aravind Putrevu: and to make sure that not just the AI is doing its job. But then, have you felt satisfied about this entire pr, so we bring down the like, we bring down the entire Pr review process by what like, not even 50%, 70% or even more in many teams right? And the only thing that the user needs to do is like, whether

105
00:26:02.940 --> 00:26:11.150
Aravind Putrevu: is this feedback really useful? And then should I go and review something else. Should I go and look at more things? So so yeah.

106
00:26:11.480 --> 00:26:32.319
Aravind Putrevu: that being said, I also want to come back and say that there exists problems with AI, wherein sometimes it hallucinates, and sometimes the data is not like relevant, and that the context probably context build up is not not appropriate. And then maybe maybe the user. Maybe the user hasn't provided the certain type of things, and then.

107
00:26:32.490 --> 00:26:37.759
Aravind Putrevu: due to the model or several other reasons, it might not have given great suggestions.

108
00:26:38.153 --> 00:26:47.020
Aravind Putrevu: But we are working on it like like, I would want to go ahead and say, one more thing like Code Rabbit isn't like in any other wrapper. We don't just send a git diff

109
00:26:47.460 --> 00:27:00.540
Aravind Putrevu: to Llm. And then you get feedback right? There's a lot of like engineering, especially, I could say, context, engineering in place that we build context like 60% of the time just goes into this context, build up phase.

110
00:27:00.680 --> 00:27:24.530
Aravind Putrevu: So so I think we do this verification and vetting process post. The review happens when, before we post it, where there's a verification agent that helps you to do all of this. And despite that sometimes maybe there's like a misses, hits and misses in that area. So we're continuously trying to improve evals, work with the model providers to do all of this. So it's a whole process.

111
00:27:25.450 --> 00:27:34.050
Aravind Putrevu: and it helps, eliminates a lot of things in the in the area, especially when you are generating using AI. I think your review would better be faster.

112
00:27:34.430 --> 00:27:40.700
Chinmay Gaikwad: Yeah, definitely, developers are not that patient. So they definitely need their reviews to be faster.

113
00:27:40.830 --> 00:27:44.010
Chinmay Gaikwad: You touched a bit on context engineering there.

114
00:27:44.570 --> 00:27:48.539
Chinmay Gaikwad: And so, just taking a step back, there was

115
00:27:48.740 --> 00:28:06.170
Chinmay Gaikwad: prompt engineering where people said the output of the Llm. Would depend on how great quality of your prompt is. Now there is context engineering. By the time this pod goes out there might be some another term that might come out. But let's stick to context engineering. Can you

116
00:28:06.440 --> 00:28:13.190
Chinmay Gaikwad: dive a little bit deep into context engineering? Just explain, what what it is and how it differs from prompt engineering.

117
00:28:13.780 --> 00:28:22.450
Aravind Putrevu: Especially for coding tools, context is everything. Context is more important

118
00:28:22.460 --> 00:28:49.119
Aravind Putrevu: than like anything else. If you're talking about generic AI applications. Even their context is so important. And what is this context? How do you build this context? Aware applications? So if you look at like any model, any model like open source off the shelf, anything or like you're looking at like proprietary large language models like Gpt. Cloud, etc. They have a specified set of like input tokens and output tokens

119
00:28:49.120 --> 00:29:15.879
Aravind Putrevu: size right? 128 K. Gpt. 4.1 has 1 million. Gemini, 2.5 has 1 million. So like context windows, they have a it's just like display size. Back in the day when I was doing android development, you have, like certain amount of like size of the phone. And then you can only load that amount of data into that area. And if you scroll up the data will load, the new data will be fetched. So that's we prepare the data. But we don't send the data to client or like we don't show it up.

120
00:29:15.880 --> 00:29:31.319
Aravind Putrevu: So. So I think you would have all seen this this back in the day, and then you you manage the display size properly. So that's what is happening here. You make sure that whatever you could fit into that context relevant information that

121
00:29:31.370 --> 00:30:00.109
Aravind Putrevu: is needed for the Llm. To make a useful decision. If it's a decision, or probably a generation. You would probably fit everything into that window and then let the Llm. Do that task. And then this entire process also needs to be evaluated as well like you should have evals. For whether it's taking the right decisions if I fit this much, how is it performing? What sort of data is getting fitted? Is it? Is it probably processing? So there are a lot of engineering in place.

122
00:30:00.200 --> 00:30:18.209
Aravind Putrevu: And then the the way that this entire. Like, you know, area of like managing the context between multiple calls multiple agents, multiple tasks makes up this context engineering thing. I would still say, this is not really, really new. It's been there for one year or one and a half year.

123
00:30:18.210 --> 00:30:35.220
Aravind Putrevu: like, you know, right? From the time I think those people who have realized this thing and built good applications and made context as a center area to entire thing really won a lot of like customer accolades. And you know, gotten a lot of a prize

124
00:30:35.220 --> 00:30:50.199
Aravind Putrevu: context is everything. It's not just because somebody has made a post like, I think again, Andre has made a post or something, but really, context plays a major role. It could as simple as like, say, especially in the AI code, a simple search.

125
00:30:50.200 --> 00:31:12.470
Aravind Putrevu: find the functions and fit it into your context with the proper prompt, will get you better results than you know you you do. You know what you call make your entire code base into a specific embeddable into an embeddable rag, and then and then, you know, process it over. So that has a different way to look at it. Yeah.

126
00:31:13.020 --> 00:31:14.460
Chinmay Gaikwad: Got it. Yeah.

127
00:31:14.670 --> 00:31:25.500
Chinmay Gaikwad: I think that makes sense. Just going beyond coding and code reviews, there is so much to the SDLC, right from building to deploying

128
00:31:25.760 --> 00:31:29.440
Chinmay Gaikwad: to managing incidents to observability.

129
00:31:29.974 --> 00:31:40.479
Chinmay Gaikwad: What area are you most excited to see innovation in with Gen AI tools, especially with regard to all of these stages of the SDLC?

130
00:31:42.090 --> 00:32:10.800
Aravind Putrevu: In the SDLC, one of the boring parts for me as a developer is doing estimations, tasking, planning, right? That's the program management part of working in software engineering. And then after program management there's code quality, which is what code review, and CodeRabbit and tools like it, addresses. That area is a bit boring too. And then

131
00:32:10.980 --> 00:32:36.720
Aravind Putrevu: after that, I think one of the important points, and then, where everyone would want to like back in the cloud, native era like want to solve for is observability right? I think debugging issues, putting fixes and all. I see a lot of them are already doing it. I don't know. Harness does it or not today. But then, you know, when there is an issue, especially with respect to projects like Apm, wherein you would be able to find issues and send fixes or evaluate fixes

132
00:32:37.010 --> 00:33:06.429
Aravind Putrevu: right in the context of incident or issue, or, like, prepare an incident, report, run a runbook, automatically, create scripts, and run that run book to find the Rca. For an issue. So I think all of these are interesting angles just after your deployment, and eventually I think we could also reach into an area where not even observability. I think the way that you want to

133
00:33:06.430 --> 00:33:10.280
Aravind Putrevu: do blue-green deployments, you want to manage users. I think

134
00:33:10.280 --> 00:33:15.369
Aravind Putrevu: even that that loop will be solved. And then I'm looking forward for

135
00:33:15.570 --> 00:33:21.459
Aravind Putrevu: more of that as well, like the data gathering data science and a lot of that sort of area like post release and post.

136
00:33:21.680 --> 00:33:38.460
Aravind Putrevu: like, you know, launching the product. I think even that area is to be very exciting. But then, yeah, I think planning is one big, big big area, because developers continues to. I mean, we can't be like as explanatory as possible. Right? We will. But then

137
00:33:38.710 --> 00:33:44.390
Aravind Putrevu: there could be more. Llms. Need more information, and it will do better if we do better planning. Yeah.

138
00:33:45.050 --> 00:33:54.319
Chinmay Gaikwad: Right. And as with most AI systems, it heavily depends on the data, the quality of the data. With planning,

139
00:33:54.840 --> 00:33:59.700
Chinmay Gaikwad: there is so much scope to have good data as well as bad data.

140
00:34:00.666 --> 00:34:05.590
Chinmay Gaikwad: For example, if I create a Jira ticket and don't put much description in it,

141
00:34:05.950 --> 00:34:23.449
Chinmay Gaikwad: the AI won't be able to perform that well, right? So are there any tools or tips you've seen work in the field for how planning tools can become more explanatory, so AI can actually use that data to help with code generation?

142
00:34:24.310 --> 00:34:40.939
Aravind Putrevu: Oh yeah, very interesting question. I don't know how you knew, because I really love this area right now, and I'm researching it myself and looking at a lot of tools. Let's break this into two areas. One is where

143
00:34:41.070 --> 00:34:53.300
Aravind Putrevu: you know the AI coding arena, or like the ecosystem is going. That's that's that involves why it involves planning and why it's so important that we can talk a bit a minute in a minute. But then, when you talk about planning tools.

144
00:34:53.300 --> 00:35:17.549
Aravind Putrevu: yes, I think there are tools like taskmaster. There are like very autonomous tools like Tracer AI cloud code itself has a plan mode. When given a code base, a large code base, it can able to like, you know, figure out entire thing and then bring up like a common codebase.md. Or like cloud.md. File that that understands that entire context of

145
00:35:17.550 --> 00:35:21.599
Aravind Putrevu: how this particular monoreport, like microservices, report, has been built up.

146
00:35:21.630 --> 00:35:22.560
Aravind Putrevu: So

147
00:35:22.870 --> 00:35:31.719
Aravind Putrevu: these tools itself are trying to do the summarization and then trying to put things together, but a little heads up into.

148
00:35:31.720 --> 00:35:55.000
Aravind Putrevu: You know, this plan needs a human intervention at this point of time. It's not fully autonomous. Say, for example, if I want to build, hey, I think, build me an Airbnb clone, and then Airbnb probably have, like at least like 5 to 6 microservices on their homepage, and then each powering a cart, a cart service, a list service for all the things, a search service, things like that.

149
00:35:55.000 --> 00:36:10.059
Aravind Putrevu: and then to be able to for a sub agent, or like a coding agent to build, you need to be able to clearly tell what is what right. So I think you still need a bit of human intervention. You still need to tell what's your common stack.

150
00:36:10.150 --> 00:36:33.999
Aravind Putrevu: So the devtool stack, or like, say, I want to use this database. I want to use this particular front end stack. I want to use typescript as my backend language and use this for unit testing. So I need to be having all of this thing in place. I mean, maybe there's a framework. Maybe there is like a common language or a standard to build up. These things would come up.

151
00:36:34.000 --> 00:36:41.560
Aravind Putrevu: probably. So we'll have to wait and watch on that area. But otherwise planning as an area is very interesting. And then I think, like, I said, planning tools

152
00:36:41.810 --> 00:36:47.049
Aravind Putrevu: are the need of R, and a lot of developers are already using it. I think

153
00:36:47.190 --> 00:37:03.490
Aravind Putrevu: ides like cursor themselves, are coming up with something. But then definitely check out cloud codes, plan, mode tracer AI planning engine. And then you could also take a look at the other taskmaster is another to do engine, which is like mode cli or something

154
00:37:04.670 --> 00:37:09.490
Aravind Putrevu: I want to like. Also come and talk about like, say, the entire area coding area like as you could see

155
00:37:09.670 --> 00:37:22.170
Aravind Putrevu: where. Say, if you have a like large epic in scrum terms, and then you want to break that epic into like stories, and then, you know, you you would want to break them further into tasks and all of that.

156
00:37:22.734 --> 00:37:39.040
Aravind Putrevu: If you have good thought into this area like a senior developer or a product owner or someone like that, and have have come from a technical leadership. Technical position eventually, like, grew up into this thing, and I think that's good enough for you to

157
00:37:39.180 --> 00:38:03.900
Aravind Putrevu: run with all these agents, and then you can create a Kanban board, and then those Kanban board can be assigned to multiple sub agents, and then those sub agents will eventually complete the tasks and push it into like in progress, and like no done stage where you move tasks seamlessly, and then it raises Prs gets reviewed by tools like Code Rabbit. So you have like a plan engine, you have a coding area.

158
00:38:04.150 --> 00:38:13.849
Aravind Putrevu: And then you also have, like a review area. And then you have, like a deployment area set by targets for you. So so it's like a command center.

159
00:38:14.140 --> 00:38:30.609
Aravind Putrevu: like like where you have for your secops or security area like, you have a command center sdlc command center, where you planning gets done by a different agent or a tool. And then you have all these tasks, and then you have this this entire deployment. So so I believe that's where everything is going on.

160
00:38:31.300 --> 00:38:40.589
Chinmay Gaikwad: That would be very fascinating to see in real life. I think it would be ridiculous. Yeah, we're getting there.

161
00:38:41.068 --> 00:39:09.669
Chinmay Gaikwad: Okay. So while we're on the point of recommendations, you mentioned a few things you find very interesting, Claude Code's plan mode, for example. What about the other parts of the SDLC, specifically observability, which I know is very close to your heart? Any exciting technology, exciting startups, or even companies that are doing AI the right way?

162
00:39:11.170 --> 00:39:14.050
Aravind Putrevu: I mean, are you asking more about the AI coding area?

163
00:39:14.988 --> 00:39:20.719
Chinmay Gaikwad: In observability, or the rest of the SDLC, like software delivery and other areas.

164
00:39:21.080 --> 00:39:25.823
Aravind Putrevu: Yeah, I would say that I haven't seen a lot of

165
00:39:26.640 --> 00:39:49.829
Aravind Putrevu: Cgs AI percolate like mainstream into observability. Yet, because I still see, like a lot many people doing dashboarding and some AI assistance here and there. But there is a chance for a lot of like these sre agents and then observability agents to pick up, maybe because of the context or the the way that it, or probably people are using like

166
00:39:50.180 --> 00:39:57.560
Aravind Putrevu: the type of models. I don't know. I haven't seen something like really crazy in that in that area.

167
00:39:57.560 --> 00:40:21.279
Aravind Putrevu: however, because of the nature like, I keep seeing at the different tools, both by harness and several other players in the ecosystem, I think, shows really good success in the, you know, early stage, but I also feel that there are a few startups which try to execute really well on the sre area, but it did not

168
00:40:21.280 --> 00:40:32.000
Aravind Putrevu: take up adoption adoption, because people are weary about giving their keys to production environments. So so that's still there. Yeah.

169
00:40:32.060 --> 00:40:53.579
Aravind Putrevu: So I haven't seen as such. Like, you know, there's 1 crazy tool as in the observability space as such. That takes away because it's a very complex system. And then that requests more and more data points to be unified, and to make like a valuable decision. But then I think small parts can be. I saw, like one of the observability player

170
00:40:54.180 --> 00:41:21.739
Aravind Putrevu: has a start, like, you know, they they raise fixes for these Apm tools. I don't know if it is datadog or sentry, or someone where they have raises these fixes for the AI, using the AI for the Apm issues like, say, if there is like a wrong error code, and then it's coming over and over. Then you analyze and Rc. And then plan it up and then port a Pr give it away. So the user tries to merge it, and then the loop becomes more clearer. Right? So that's the thing.

171
00:41:22.590 --> 00:41:46.680
Chinmay Gaikwad: Got it, okay, makes sense. A lot of innovation on the left of the SDLC, I guess; the right has to catch up sooner or later. One thing I wanted to get your take on: if I am an engineering leader, with all of these tools for planning, code generation, and code reviews that you mentioned,

172
00:41:46.680 --> 00:42:06.189
Chinmay Gaikwad: what are some of the best practices you would recommend engineering leaders to adopt is, and it can be, like some, some something that you have seen work in customer environments, or it can also be like, Hey, this like, use one tool across the board. Anything? That you would like to share.

173
00:42:07.320 --> 00:42:35.210
Aravind Putrevu: There are different strategies organizations are following. From my experience and what I'm seeing, people use a whole lot of tools. For lack of a better example, it comes down to the taste of what the team wants. It's like when I join a company and I'm asked, do you want a Mac or a Windows machine? I would choose a Mac, but it depends on the developer, right? Similarly, at this point in time,

174
00:42:35.700 --> 00:42:54.070
Aravind Putrevu: most of these tools are at disposal, and the developer's disposal, or the team's disposal, and, like team, or the person would choose their stack accordingly. And then the engineering leadership is also like quite open to let people pick their own, especially AI code generation assistant.

175
00:42:54.100 --> 00:43:17.599
Aravind Putrevu: and then, unless it is super expensive or like not available for them, I also see some teams giving away these. The architects or staff or senior engineers like staff level plus engineers. Tools like cloud code, which probably costs like 30 to $40 in a single run. Right? Could cost that. But then for quick pocs, and then, you know, for quick showcasing of work.

176
00:43:17.770 --> 00:43:20.180
Aravind Putrevu: I think there were.

177
00:43:20.360 --> 00:43:46.689
Aravind Putrevu: There were quite a few tools like these, and then people were giving these options. So there's not one tool I would also personally recommend in general to use a multitude of tools. It's better for you to find. You know, it's not like the same sauce that's being sold in different areas, right? It's also like unique in its own way. When you combine multiple agents, and where each comes with a different, probably different system. Prompt? I hope so, like, you know.

178
00:43:46.690 --> 00:43:51.350
Aravind Putrevu: And then you also have maybe the code generation, not just because

179
00:43:51.410 --> 00:44:14.420
Aravind Putrevu: the company that I work for is doing code reviews. But then, code generation is more commodity in general, like wherein you have a lot of code generation tools. You could also use open source models like Deepseq, not the r. 1, but then the deepseq, v. 3 is really really amazing. Right? You have Quen 72 B, which is very popular. Quencoder is really popular. There's like a lot of these coding pop, coding tools.

180
00:44:14.440 --> 00:44:37.800
Aravind Putrevu: models that you could use off the shelf, and then, you know, just use it. But then, when it comes to other tasks, like probably refactoring, like probably Qa testing. And then, you know, like, you know, screen testing like, like live screen testing with tools like playwright or like code reviews. And you know, deployments observability. Then I think you need a stack. You definitely need to choose something

181
00:44:37.850 --> 00:45:04.619
Aravind Putrevu: like even code based context built up sometimes repo, prompt tool set report prompt would also help. So your team needs an arsenal of these. And these are like, like, you know, different drill bits in a power tools system. So you also need to know how people would use it. Best practices include, like, you know, definitely. Choose an AI coding tool. Give an option to

182
00:45:04.820 --> 00:45:15.540
Aravind Putrevu: build like, you know, an agent recording tool that works on, probably on a, on a central ecosystem. And then, you know, yeah, I think that that would definitely help.

183
00:45:16.880 --> 00:45:28.279
Chinmay Gaikwad: Got it, makes sense. On the same lines, you recently wrote about the developer tool stack with AI. Can you expand a bit on that?

184
00:45:28.890 --> 00:45:45.500
Aravind Putrevu: Yeah, most teams actually have one or another of these tools. Continuing from what I said, I think there are multiple layers in this entire dev tool stack. Say you want to generate code:

185
00:45:45.500 --> 00:46:04.910
Aravind Putrevu: you can use like the Id, or like the cloud, like like the Devons of the world, like the background agents and devils of the world. To generate. Like Github also has a coding agent. Jules is one another, like everyone has a cli event. Right? You have these code generation tools, and then you also need something like, you know your

186
00:46:05.130 --> 00:46:24.649
Aravind Putrevu: what do you call your essential tool, like when you want to definitely review that code that is coming in. I'm talking about a team. I'm not just talking about an individual developer, even for individual developer to accelerate really fast. You would definitely want a reviewer because you're shipping like 13,000 lines. Pr, right? You're not shipping like

187
00:46:24.650 --> 00:46:40.730
Aravind Putrevu: smallish prs. So so you need definitely this stack of, like, say, a tap completion or an AI coding assistant tool. And then probably you would need a front end tool, like, you know, app generation for prototyping and all.

188
00:46:40.940 --> 00:47:09.819
Aravind Putrevu: So that's what I wrote about. I generally wrote about the entire arena of like, what is your dev? Stack could look like? What are optional that you could probably look aside. But what is essential and what is like foundational to your entire Dev tool stack, and how it is evolving, and then, if you see more, please feel free to add them, or like share more about it, I think that would be amazing to know.

189
00:47:10.460 --> 00:47:34.000
Chinmay Gaikwad: Yeah, definitely, we'll add that in the show notes. Okay, so you've written extensively about the dev tool stack. On a more personal note, I'm curious: have you done an AI project for yourself, apart from work? If yes, what was it, and what was the outcome?

190
00:47:34.520 --> 00:47:38.949
Aravind Putrevu: Yeah, I've done more than a couple of apps. I recently built,

191
00:47:39.712 --> 00:47:53.497
Aravind Putrevu: between the conversation that we had and I recently built a Zen Stack Guide Zendesk guide product. I I got curious about how to build a support center and it seems like with an Icms plus

192
00:47:53.870 --> 00:48:15.569
Aravind Putrevu: plus a nice front end. You could do a lot. And I gave Claude Code this task of, like, you know, building a support center planet, build it, and they did a great job of replicating like help.openai.com. I have this project on my github like it's called Help Center. It's decent. I just need some more work. Maybe I could. Maybe I could do that. But that's pretty exciting.

193
00:48:15.600 --> 00:48:34.249
Aravind Putrevu: that what these Llms could build. But another important project that I am working for a long time more than 2, 3 months, but I really want to launch it like a sas app is git for vibe coders a lot of people. I see that struggle with storing code at this point of time, and they can't manage git, you know, I think, for even for a lot of developers.

194
00:48:34.380 --> 00:49:03.800
Aravind Putrevu: They struggle to do merge versus rebase, or like finding. How do I solve merge conflicts? How do I probably like Stash, and then probably pick up another branch, and then again pop this trash and then do all of that. So imagine for a lot of non-tech users they still want to store the code, and they need to have a versioning system that's not so complicated. So just curious to build something. And then some of this is like on different different reports on

195
00:49:03.800 --> 00:49:06.150
Aravind Putrevu: my Github, so I could share that.

196
00:49:06.660 --> 00:49:10.630
Chinmay Gaikwad: Yeah, definitely, I think that's a very interesting project.

197
00:49:10.820 --> 00:49:20.130
Chinmay Gaikwad: I can definitely see somebody picking that up and actually taking it to production. So let's see how that goes. One last thing for me:

198
00:49:20.979 --> 00:49:30.760
Chinmay Gaikwad: since there has been an explosion of AI tools, there has also been an explosion of places to find information about these AI tools and how to learn them.

199
00:49:30.950 --> 00:49:45.050
Chinmay Gaikwad: What are your go-to sources for learning about the latest stuff, or tutorials, or just what the community is talking about? Do you have a handful of resources you could share with our viewers?

200
00:49:45.680 --> 00:50:09.440
Aravind Putrevu: To be honest, I spend quite a bit of time on Twitter and consume a lot of information through custom lists I put together, lists with a lot of AI thought leaders, people who share information about AI in general. There isn't one source I'd say gives you the most information. But

201
00:50:09.440 --> 00:50:16.050
Aravind Putrevu: I try to capture a lot of these things through AI. I kind of like run a small AI. Agent.

202
00:50:16.050 --> 00:50:29.450
Aravind Putrevu: This is another project that I never talk about this. I have a dev shots and newsletter where I, the AI. Automatically brings up a lot of useful information to me every week, and then I make a catch up called Dev. Catch up every week.

203
00:50:29.450 --> 00:50:46.889
Aravind Putrevu: One is going live today in few hours. So similarly I follow like the but the discard latent space discard newsletter. Pretty amazing latent space pod itself is really great. Everyone should take a look at it.

204
00:50:46.970 --> 00:51:06.510
Aravind Putrevu: And then there are a few social handles like the rundown. AI just gives you all the b 2 c AI information, and there's too much AI going at this point of time for a normal people outside to even look at. But then, I think, especially for a developer. And who's interested in the AI coding area.

205
00:51:06.510 --> 00:51:22.330
Aravind Putrevu: There is nothing better resources than these community centers like, I said, the Newsletter than the brand themselves. They share a lot of interesting blogs and information. And and then, yeah, I think those are like the cool things.

206
00:51:22.870 --> 00:51:43.410
Chinmay Gaikwad: Got it, and we'll add those in the show notes as well, so you definitely won't miss out. This was an incredible conversation about everything in the SDLC. Aravind, I really appreciate the time you took to talk to us today. Any last parting words you want to share?

207
00:51:44.240 --> 00:51:54.500
Aravind Putrevu: I definitely want every developer to give AI coding a shot. If you're skeptical, try it

208
00:51:54.630 --> 00:52:21.469
Aravind Putrevu: like something that a new toy or a new tool. You're learning a new library or like new tool. I would also want to say that the entire Internet is going through a new way to do Internet itself, like New way to do web. And then there's a lot of opportunity for each one of us to contribute to and also build in this in this future. So don't be shy, definitely, get part in it, and probably build something exciting as well, who knows?

209
00:52:21.580 --> 00:52:25.029
Aravind Putrevu: Yeah, you could be building the next app dynamics.

210
00:52:25.790 --> 00:52:31.190
Chinmay Gaikwad: Awesome, this was great, Aravind. Thanks a lot again, and thanks everyone for listening.

211
00:52:32.010 --> 00:52:32.750
Aravind Putrevu: Thank you.

