John Ragsdale & Sanjeet Bali & Timur Yarnall & Bhavesh Mehta 42 min

How Generative AI is Transforming Post-Sales CX


You’ve heard the buzz - generative AI is revolutionizing the post-sales customer experience - but how are leading companies leveraging it effectively? In this panel discussion, industry experts will challenge conventional post-sales practices, sharing how AI-driven innovations are driving personalized support, faster resolutions, and deeper customer engagement. You’ll gain insights into how AI is transforming the way businesses deliver value after the sale, and leave questioning whether your support strategy is keeping pace with what’s possible.



0:00

So this is our first of several panels.

0:04

And this topic, AI, generative AI, all the components of AI is a complex one.

0:13

And we could unpack it for days.

0:17

We have a day, but this is going to be a high impact session.

0:21

We have a mix of great panelists representing different viewpoints in the

0:25

industry, from

0:26

builders, developers, and practitioners in the CX space.

0:31

So I won't steal John's thunder who will introduce the panelists, but I would

0:36

like

0:37

to welcome to the stage the host and moderator, John Ragsdale.

0:41

Many of you know him.

0:43

He is distinguished researcher and vice president of technology ecosystems at

0:49

TSIA.

0:50

Welcome, John.

0:51

And thank you.

0:52

Thank you. Great to be here.

0:57

I have been such a fan and supporter of Krishna and SupportLogic since it was

1:02

like five people

1:03

working in a WeWork space on Stevens Creek Boulevard.

1:07

So it's fantastic to be here at the first live event.

1:12

So if you think about it, you know, ChatGPT came on the market just a year and

1:18

ten months

1:18

ago.

1:19

And in my very long, long, long career, it's rare to see something have such a

1:28

quick and

1:29

transformative impact.

1:32

And I think that's because this is one of the first examples of AI that

1:36

everyone could

1:37

instantly understand.

1:39

And it affects you personally.

1:41

It affects you professionally.

1:44

And a survey that TSIA did found that 72% of B2B support organizations were

1:50

already using

1:52

gen AI for agent-facing use cases and 50% were experimenting with customer self

1:58

service use

1:59

cases.

2:00

So adoption has been really, really rapid.

2:03

But I know that some companies are still early in their journey.

2:07

I talked to companies still shopping for use cases, trying to get budget.

2:11

But we're going to be speaking with three panelists today who are definitely

2:16

pace setters

2:17

around AI, ML, Gen AI.

2:20

And we're going to be talking about some of the potential for the technology,

2:24

but also

2:24

getting into the business value.

2:27

Because technology is cool, but at the end of the day, if we're not getting ROI

2:32

from it,

2:33

it's a problem.

2:34

So I'd like to welcome our panelists to the stage.

2:37

And first up, we have Sanjeet Bali.

2:42

[APPLAUSE]

2:44

Thank you for coming.

2:49

So Sanjeet is senior vice president of global customer support for N4.

2:56

And prior to N4, she held executive positions at Trellix, HERE Technologies, and

3:01

mobility.

3:03

And she is also on the advisory board for CSM Practice.

3:07

So thank you for being here.

3:09

Our next panelist is Timur Yarnall.

3:11

[APPLAUSE]


3:17

Timur is the head of the AI and ML business development team for Amazon Web

3:24

Services Startups.

3:26

So he is living at the center of innovation.

3:30

Previously, he was the CEO and co-founder of Neutronian.

3:34

And he has also held executive roles at Glean and Neeva.

3:38

And our final panelist today is Bhavesh Mehta.

3:42

[APPLAUSE]


3:49

Bhavesh is senior manager of AI and automation for Uber.

3:54

He leads generative AI, conversational AI, and automation initiatives for

3:59

customer support,

4:02

directing AI strategy for Uber.

4:05

And previously, he held product engineering and strategy development roles for

4:11

Cisco.

4:12

So we opened the day with Krishna talking about really the silos of the post

4:18

sales customer

4:19

experience.

4:20

And I think it's fair to say, in the best of times, it's been a disjointed

4:25

experience.

4:26

And the worst of times, it's been incredibly painful for customers and for

4:32

employees.

4:34

So I want to start by asking each of the panelists, what do you see as the

4:39

single biggest challenge

4:41

for support organizations today?

4:43

Sanjeet, could we start with you?

4:45

Yeah, I think, unfortunately, I don't have a very positive view on where we are

4:50

today.

4:51

So the leaders are leading, I would say, very overheated organizations.

4:56

The complexity is growing, especially in the complex product portfolio

5:00

companies.

5:01

The number of issues and inquiries, the volume, is growing.

5:05

I think what is also happening is that the escalations are happening every day.

5:11

Agents are dealing with outdated technology; they don't have the right tools

5:16

needed

5:17

to serve the customers right.

5:19

They're constantly in the firefighting mode.

5:21

So the whole view of us always wanting to get to that proactive level of

5:25

support is

5:26

still elusive, which is compounded by the fact that we don't treat support

5:32

functions

5:33

as revenue functions.

5:34

They still remain the cost functions.

5:36

So there's lots of challenges there.

5:39

And I feel that we're in the right place now with technology helping us scale

5:44

and creating

5:45

those positive experiences, but creating that bandwidth for our support people

5:49

to do the

5:49

right things for our customers.

5:51

So many challenges, but many opportunities to do.

5:54

Yeah, technology, culture, political, a lot of issues.

5:58

A lot of issues.

5:59

I think the escalations are basically the ones which are really really driving

6:03

the companies

6:05

to not fix their customer experience because they're constantly in that vicious

6:09

cycle.

6:09

Yeah.

6:10

Yeah.

6:11

Bhavesh, what are your thoughts on the top challenge for support?

6:14

Yeah.

6:15

I'll start by sharing a couple of examples, one from the enterprise business,

6:20

one from

6:20

consumer.

6:21

On the enterprise side, at Cisco I used to lead the data protection engineering team

6:26

delivering

6:27

snapshot disaster recovery kind of solutions.

6:29

When a customer engages with you, in a way you have already disappointed them with

6:34

your product

6:35

offering.

6:36

They come to the table, upset, angry, oftentimes, and oftentimes is the case

6:40

with enterprise

6:41

sales.

6:42

Somebody has put their job on the line to buy your product, especially, let's say, if

6:47

you're coming

6:47

from startup.

6:48

So it's not just about resolving that issue at that time.

6:52

There are a lot of emotions in play as well.

6:54

Similarly, on the consumer side at Uber, for example, if somebody's ride doesn't

6:58

arrive

6:59

and they are on their way to the airport missing a critical meeting, or food doesn't

7:04

arrive and the

7:05

family goes hungry.

7:07

Again, a lot of emotions in play.

7:10

How do you make them whole?

7:11

Your product experience, product defects have disappointed them.

7:15

In a way, not much has changed with respect to what people are expecting from

7:19

the support.

7:21

They expect you to have their back when things don't quite go as expected.

7:26

It may be a product issue.

7:27

It may be, in let's say Uber's case, a physical-world issue: staffing,

7:31

accidents, and

7:32

whatnot.

7:33

What compounds the situation is that there is just an enormous amount of, you

7:41

know,

7:41

technologies and tools that people are using right now: Slack, WhatsApp, chat,

7:46

other chat interfaces,

7:48

voice.

7:52

What they don't want is they don't want to repeat the context.

7:58

Again and again, as you shift through different ways to help the customer, they

8:04

want fast,

8:05

accurate, context aware, and highly personalized resolutions.

8:10

The same resolution may not be appropriate for everyone.

8:13

Sometimes it's just acknowledging, hey, I hear you, I got your back and this is

8:18

how

8:18

we are going to resolve the issues.

8:21

As we head into AI, this multimodality is also going to become quite important

8:25

where

8:26

initially because of the business reasons, also we were guiding customers for

8:31

text,

8:32

ideally, optimally async support so that we are able to offload the operations

8:37

to maybe

8:37

cheaper regions.

8:39

Not anymore.

8:40

I think not only does Uber operate in the attention economy; we as a society are operating

8:46

in an attention-deficit

8:46

economy.

8:47

People want it quick, they want the issue resolved, they want to be on their

8:51

way.

8:52

So that's a big, big challenge but also opportunity that we are heavily leaning

8:57

into AI to solve

8:58

for us.

8:59

Fantastic.

9:00

Timur, what are your thoughts?

9:01

Well, I think Sanjeet and Bhavesh covered it quite well.

9:05

It's hard for me to say other than I think in the last presentation, showing

9:11

numbers like

9:12

77% or 81% of our customers want a better experience or faster resolution;

9:18

that shows relatively

9:20

speaking, what baseline we are starting from.

9:22

I don't know, maybe it was part of speaking on this panel coming up and you're

9:26

aware of

9:27

it but I've had terrible support interactions in the last three weeks that I

9:31

can name.

9:32

Some with major airlines for example, I don't know how many people can say that

9:35

they've

9:35

got bad interactions in the last two or three weeks that come to mind for them.

9:40

So for me, I was trying to fly home from Seattle after we had this amazing

9:45

accelerator

9:46

kick off with Andy Jassy and Matt Garman and the height of technology and I

9:51

just wanted

9:52

to do a same day change on my flight.

9:55

And Delta put me into this infinite loop on my mobile app and on the call where

9:59

I was

9:59

not allowed to do a same day change to SFO because I was booked into SJC.

10:06

Fortunately I took the time and I pulled out my laptop and on my laptop, that

10:12

option was

10:13

there for some reason.

10:15

But it took me about 45 minutes of digging to get out of the infinite customer

10:19

support

10:19

loop.

10:20

Now just think about how basic that is.

10:22

By the way, I have over a million miles on Delta.

10:25

I'm a Million Miler on Delta.

10:29

So with that as an anecdote of what we can point to, I am very excited by the

10:34

SupportLogic

10:34

story for a number of reasons, because I think the fundamental challenge

10:38

is how are

10:38

we going to use gen AI to augment human capability. All the gloom-and-doom

10:43

dialogue around replacing

10:45

humans and eliminating call centers is to me so far-fetched, because we're starting

10:48

from

10:48

such a bad experience baseline.

10:51

So if we can use these signals to actually make the experience that much better

10:57

, I won't

10:57

dread calling support on a par with going to the dentist.

11:02

And that's almost how I feel right now.

11:04

I mean, I had another experience when I onboarded to new healthcare, because I'm

11:09

only eight months

11:10

into my role with AWS, having come from Glean.

11:14

And I had two healthcareers going at the same time.

11:18

I had to call in and tell them which one was active.

11:21

And of course it took me six months to do that because I knew what was going to

11:25

happen

11:25

and I didn't have an hour and a half to waste because I'm onboarding to a new

11:28

role.

11:29

So with that being the upside, I think that's the challenge of just what we're

11:32

starting

11:32

with. And I think, as we'll talk about a lot more, using this to augment

11:36

human capability

11:37

is going to be so vital and enable so much better interaction with our

11:45

customers.

11:47

I think that you've all touched on really important challenges.

11:52

I think the biggest underlying challenge is something you mentioned that 60% of

11:56

B2B support

11:57

organizations are cost centers and they don't have the respect, the visibility.

12:03

We all know in this room the impact that a positive support experience has.

12:08

Sometimes getting company executives to understand that.

12:12

I think, and I'm just so sorry to interrupt you, but I think that's where the

12:15

difference

12:15

is because most of the companies are deciding on putting more resources and

12:20

investment in

12:20

customer success, and that's the right strategy.

12:24

But think about it, you cannot build a great customer success if you don't have

12:28

a solid

12:28

foundational support.

12:30

So it's like, we're not fixing support, you cannot fix success, you cannot

12:34

drive that.

12:35

So there is that tactical entity mode that we put support in and not seeing it

12:40

as strategic

12:41

is the biggest mistake that we are making.

12:44

And that's where there's a lot to change in terms of how we look at support in

12:48

terms

12:49

of its operating model.

12:50

We haven't revised that, we haven't gone back and looked into it.

12:53

We're still running support the way people have been running for 20 years.

12:57

So I think there's lots that we need to discuss and that's going to be probably

12:59

in our long

13:00

conversation.

13:01

Yeah, absolutely.

13:02

But I think that SupportLogic is bringing that data to show the strategic

13:09

value of support

13:10

and instead of building all these health scores for success manager, building

13:14

health scores

13:15

to show the correlation between a positive support experience and the long term

13:19

customer

13:19

value.

13:20

So they were the very first company to build health scores for support.

13:24

Okay, well, let's dive into some questions, and Sanjeet, I'm going to start with

13:29

you.

13:29

So comprehensive root cause analysis is basically nonexistent for most

13:34

companies.

13:35

And there have been a lot of runs up the hill over the years.

13:39

Back in my early days running support, we had our agents stage a walk out

13:45

because we

13:46

were forcing them to use this decision tree that was so painful that they said

13:51

we would

13:51

rather lose our jobs than use this tool.

13:54

So could you talk about how AI can help with root cause analysis to not only

14:00

streamline

14:01

problem diagnostics but ultimately eliminate problems entirely?

14:05

Yeah, I think Krishna covered it in his presentation today.

14:09

If you look at the entire SupportLogic functionality, it does address that,

14:14

right?

14:15

Like I always said, and maybe I said it to you as well: show me a

14:19

company's

14:19

RCA and I'll tell you the culture and the capability and maturity of that

14:23

organization.

14:24

And so there are three parts to it which is culture, which is data and which is

14:29

bandwidth.

14:29

So we are all well-intended people wanting to do the right thing for the

14:32

business and

14:33

our customers, but we don't have the bandwidth to do things like basic RCA.

14:37

And if you don't do that, the systemic aspects of your products and other

14:42

experiences don't

14:43

get resolved, so you're again in a constant vicious cycle.

14:46

I think this case summarization, predictive analytics, escalation management,

14:51

data insight,

14:52

voice of the customer: all of these elements that SupportLogic enables create

14:57

that bandwidth

14:58

for us to look at this thing.

15:00

But the data on the ML side of things is about finding the right signals,

15:03

finding the right

15:04

patterns, right? The correlation and the data from different disparate sources,

15:09

all of this

15:10

intelligence is what makes it possible for us to do a better job at RCA.

15:14

Because RCA is a painful, extensive process, but if you want to do it at scale,

15:19

then you

15:20

really want to rely on these technologies to give you those patterns and those

15:23

signals

15:23

to act on.

15:24

So I think this is the solution for creating that bandwidth and also creating

15:29

that data

15:30

and signals and patterns and inferences to do so.

15:34

And as technology grows more and more complex every year, it's really going to

15:39

be up to

15:39

AI to enable us to do this.

15:41

And think about the precision RAG that we talked about.

15:44

That is the resolution intelligence because in my B2B complex context, just

15:48

giving me

15:48

two knowledge-based articles does not help me.

15:51

And in fact, makes the agents say that I don't want to look at it because I

15:54

know this

15:55

stuff.

15:56

They want something that reduces that complexity and brings that information

16:00

that then enables

16:01

them to do better stuff.

16:03

So in a regular B2B, less complex product portfolio, that's a good way to serve

16:09

the user.

16:10

But RAG is the way for us to do more.

16:13

Maybe Timur could probably say it better than me, but I feel like that's the

16:17

solution,

16:18

that resolution intelligence.

16:20

And the context continuity, that's really, really important.

16:23

And people don't have that.

16:25

Our agents today hop from five systems to six systems to just get basic

16:28

information to

16:29

solve something.

16:30

And by the time they get to that, they have two other escalations that are

16:34

popping up

16:34

somewhere.

16:35

So where do you get an RCA?

16:37

And when you don't do RCA, you don't fix systemic issues.

16:40

So you are in that vicious loop.

16:42

Yeah.

16:43

And relative to root cause analysis, oftentimes it's not even clear

16:49

whether it's

16:50

an issue.

16:51

And this is where we implement unsupervised clustering or anomaly detection

16:56

techniques.

16:57

Even flag.

16:58

Yeah.

16:59

So it might be saying: something is happening here.

17:01

And it could be based on various system metrics that you have various business

17:05

metrics

17:05

that you may have.

17:06

For example, every consumer business runs into some sort of fraud.

17:10

At any time fraud is happening, you just don't know whether it's big enough or

17:15

whether it's

17:16

happening, where it's happening.

17:17

So you put some guardrails about, okay, you know what, I want to operate this

17:20

particular

17:21

line of business with this particular region.

17:24

And this is my budget.

17:25

If it goes week after week, a little bit beyond this, then it might be a cause

17:29

of concern.

17:30

So I think AI is already helping there.
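The budget-guardrail idea described above can be sketched as a simple rolling check: flag a week only when it both exceeds the region's budget and is a statistical outlier against recent history. This is an illustrative sketch, not Uber's actual system; the window, threshold, and data are assumptions:

```python
# Illustrative guardrail: flag a region/line-of-business week as anomalous
# only when spend exceeds its budget AND is an outlier vs. recent history.
from statistics import mean, stdev

def weekly_anomalies(weekly_cost, budget, z_threshold=2.0, window=8):
    flags = []
    for i in range(window, len(weekly_cost)):
        history = weekly_cost[i - window:i]
        mu, sigma = mean(history), stdev(history)
        z = (weekly_cost[i] - mu) / sigma if sigma else 0.0
        if weekly_cost[i] > budget and z > z_threshold:
            flags.append(i)  # week index worth a human look
    return flags

# A stable series with one suspicious spike in the final week (index 10)
costs = [100, 98, 103, 101, 99, 102, 100, 97, 104, 100, 180]
print(weekly_anomalies(costs, budget=120))  # -> [10]
```

Combining the absolute budget with a relative z-score keeps ordinary weekly noise from paging a human while still catching a genuine drift "week after week, a little bit beyond" the budget.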

17:33

And the second area, and I know SupportLogic also has this feature.

17:36

When you have a ton of defects coming in, to get to the root cause analysis,

17:41

you need

17:42

to land it with the right subject matter expert.

17:44

And how do you route it?

17:46

Based on the intent taxonomy or the content within the support ticket.

17:52

I think that's also where AI is already helping us and showing data.
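Routing on an intent taxonomy, as just described, can be sketched with a toy keyword scorer. A production system would use a trained classifier or an LLM; the taxonomy and queue names here are invented for illustration:

```python
# Minimal intent-taxonomy router: score a ticket against keyword sets and
# hand it to the matching subject-matter-expert queue. Taxonomy is invented.
INTENT_TAXONOMY = {
    "billing": {"invoice", "charge", "refund", "payment"},
    "data_protection": {"snapshot", "backup", "restore", "recovery"},
    "access": {"login", "password", "sso", "locked"},
}

def route_ticket(text, taxonomy=INTENT_TAXONOMY, default="general_queue"):
    words = set(text.lower().split())
    scores = {intent: len(words & kws) for intent, kws in taxonomy.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else default

print(route_ticket("Nightly snapshot failed and restore is stuck"))
# -> data_protection
```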

17:55

I think that's a great point.

17:59

I love what you said in terms of the delivery and the use of RAG retrieval

18:02

augmented generation.
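As a rough illustration of the RAG pattern being discussed, the idea is to retrieve the most relevant passages and then ground the model's prompt in them, rather than handing the agent whole knowledge-base articles. Term overlap stands in for a real vector search here, and the knowledge base and wording are invented:

```python
# Toy RAG pipeline: retrieve the passages most relevant to the question,
# then build a prompt grounded in them.
def retrieve(query, docs, k=2):
    q = set(query.lower().split())
    ranked = sorted(docs, key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\nQuestion: {query}"

kb = [
    "Restart the sync agent to clear error E42.",
    "Invoices are generated on the first of each month.",
    "Error E42 means the sync agent lost its session token.",
]
print(build_prompt("How do I fix error E42 in the sync agent?", kb))
```

The "answer using only this context" framing is what keeps the generation tied to retrieved facts instead of the model's general training data.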

18:03

And I guess just for show of hands, how many folks are at organizations that

18:06

are training

18:07

LLMs now, or SLMs, or some model?

18:10

Yes.

18:11

Okay.

18:12

So it's pretty good.

18:13

I mean, heavy involvement there.

18:14

So I think that as you call it out, you know, getting that right data at the

18:18

right time

18:18

to the model is so critical.

18:20

And it's not being done well enough.

18:22

I know what you said of just showing two knowledge articles is actually

18:25

probably going to delay

18:26

the interaction that somebody screens through it versus getting right to that

18:31

search point.

18:32

And I think at AWS, and I am in the AWS Startups org, as we're seeing this

18:38

explosion in interest

18:39

here, what has really caught my eye about SupportLogic, you know, besides the

18:44

connect

18:44

integration, and I think we've all seen kind of voice as that next frontier

18:48

kind of tied

18:49

to this.

18:50

So, we're seeing just an explosion in RLHF companies.

18:54

So, you know, reinforcement learning from human feedback.

18:58

Guess what?

18:59

The humans are among the most important, if not the most important, aspect of it.

19:03

And companies that can provide RLHF, I mean, we've seen several that have

19:07

gone from,

19:08

call it, 5 million in revenue, 28 months ago, to over 100 million in ARR today

19:15

because they're

19:15

providing that tremendous service to verify, you know, hey, is the model

19:21

getting the right

19:22

point.

19:23

And what fascinates me about what support logic is doing, it's a bit of a

19:26

stretch and

19:27

obviously I don't know exactly what's on the roadmap.

19:30

But if you start thinking about your support agents is actually verifying, you

19:33

know, what

19:34

the model is bringing you in terms of a score.

19:37

You take those scores and actually start feeding that into some of your model

19:41

training.

19:42

I think that's a fascinating idea and I think it starts turning to are we

19:46

thinking about

19:47

our support folks as a cost center or as a real investment to train this

19:52

incredibly

19:52

valuable resource for the future.

19:54

So, that's part of what I'm quite excited about here and I don't have the depth

19:59

in terms

20:00

of support specifically that you all have but I think that's a very interesting

20:04

, you

20:04

know, pathway to explore.
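That feedback loop, with agents verifying model-produced scores and their corrections fed back into training, can be sketched as turning agent verdicts into preference pairs, the raw material for RLHF-style reward modeling. The record schema and field names are assumptions for illustration, not SupportLogic's actual pipeline:

```python
# Turn agent verdicts on model-predicted scores into preference pairs —
# the raw material for RLHF-style reward modeling. Schema is illustrative.
def to_preference_pairs(reviews):
    pairs = []
    for r in reviews:
        if r["agent_verdict"] == "wrong" and "agent_correction" in r:
            pairs.append({
                "prompt": r["ticket_text"],
                "preferred": r["agent_correction"],  # human-verified label
                "rejected": r["model_score"],        # what the model said
            })
    return pairs

reviews = [
    {"ticket_text": "Third outage this week, evaluating alternatives.",
     "model_score": "neutral", "agent_verdict": "wrong",
     "agent_correction": "high churn risk"},
    {"ticket_text": "Thanks, the patch fixed it.",
     "model_score": "positive", "agent_verdict": "correct"},
]
print(to_preference_pairs(reviews))
```

Only tickets where the agent overruled the model yield training signal; confirmations could also be logged as positive examples, depending on the reward-modeling setup.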

20:06

And thank you for bringing up RLHF, because that was something I wanted to

20:09

ask about.

20:10

So, a lot of buzz about that right now.

20:12

Bhavesh, I have a question for you.

20:14

Uber has a really unique set of customer profiles.

20:19

So you're dealing with drivers and merchants and customers.

20:24

So how is AI helping you really build the trust across all of these very

20:29

diverse contingents?

20:32

And especially, you know, high-stake interactions, a lot of emotion.

20:36

I can't tell you how many times I've left my cell phone in an Uber and called from a

20:41

landline

20:42

panicked about it.

20:44

So how is AI helping with this?

20:46

Yeah, yeah.

20:47

Now this question, I think when it comes to the element of trust, I have a

20:51

favorite story

20:52

to tell it from my childhood.

20:54

Growing up, there was a chewing gum that was available.

20:58

I didn't particularly like the taste of it.

21:01

Moreover, the wrapper included some statistics for the game of cricket, and if

21:06

you reached a certain

21:07

number, you were in for a prize.

21:10

The shopkeeper there, he saw me day after day, you know, buying the chewing gum

21:15

, throwing

21:16

it in the trash, just collecting the wrapper.

21:18

And after a few days, he's like, you know, what's going on here?

21:22

And I said, you know, I'm in just for that prize.

21:25

So he said, you know, let me save you all the trouble.

21:28

You know, here are the three options for the prize and you can select it.

21:32

You know, I was a, you know, a happy kid.

21:34

I think this, and then I happened to visit that shopkeeper a few years ago.

21:39

The area has significantly, you know, developed, gentrified, a lot of big malls

21:45

and, you know,

21:46

other fancy shops around.

21:47

That shop still exists with, you know, dedicated, loyal client base.

21:52

But what that person did in that moment is identify, pretty much proactively, what

21:59

me as a customer,

22:01

I was looking for from the entire process, and he delivered that outcome, saving me

22:05

money,

22:06

but also buying my loyalty.

22:08

Now, going back to the trust equation, you know, I think what customers are

22:12

looking for

22:13

is, you know, we talked about having your back, but they're also looking for

22:16

accountability.

22:17

When something goes wrong, are you taking ownership of the problem?

22:22

It doesn't matter where the issues, maybe I didn't follow your requirements

22:26

properly.

22:28

Maybe you know, something else happened in the physical world in Uber's case.

22:32

Are you providing consistency?

22:35

Meaning I will run into support issues from time to time.

22:38

Are you going to delight me every single time or is it a one-off thing?

22:41

And I think that's where the RLHF bit is also important.

22:44

How do you take experience from the most, you know, expert agents and,

22:50

using AI,

22:51

transfer it, make it accessible to every single person?

22:55

And so now, you know, how is AI helping?

23:01

You know, it starts with understanding.

23:03

You know, LLMs, they are out of the box, they are extremely good at

23:06

understanding pretty

23:07

much anything you say.

23:08

Now, of course, you know, with the previous generation of NLP models, you had to train

23:12

the, you know,

23:13

let's say, the Google Dialogflow family of models, against specific sorts of

23:16

intents.

23:17

If it falls outside of your training repertoire, it may just say, I can't help you,

23:23

and it was

23:24

very limiting that way.

23:26

Now, the LLM, now you have an opportunity to at least understand what the

23:29

customers are

23:30

saying.

23:32

After you understand, I think that's when you need to have a set of policies,

23:37

business

23:37

policies to respond to the intent in empathetic manner.

23:43

While you are resolving the issue, making sure you are communicating to the

23:47

customer that

23:49

throughout the course, you have their back.

23:51

And I will give you a very simple example of before and after of LLM.

23:57

Your food is running late, a very common defect for Uber Eats.

24:03

If you asked the previous generation of NLP models, when you trained against

24:08

this intent,

24:09

they will say, well, you know, thanks for letting us know, we

24:13

want

24:13

to inform you that your food is still on track and here is a link where you can track it

24:16

live.

24:17

Five minutes later, the same customer comes and asks; it will say the same thing.

24:22

Now, this is, this has been the chief complaint against all the chatbots, no

24:26

matter who has

24:26

implemented them, it's too robotic.

24:29

Now, modern LLMs, they are quite nuanced.

24:34

They know that the customer had asked for this problem before.

24:38

The fact that the customer is bringing the same problem to me again, that means

24:41

anxiety

24:42

level is high.

24:43

You are talking about nuances of emotions.

24:44

So, it will respond, it will keep the facts the same, which is, you know, here

24:49

is the

24:50

driver, here is where you can track it, but explicitly acknowledge the

24:53

elevation in the

24:54

anxiety the customer has.
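The before/after behaviour Bhavesh describes comes down to context continuity: keeping session history so a repeated inquiry changes the instruction given to the model. A toy sketch, where the intent labels and prompt wording are invented for illustration:

```python
# Context-continuity sketch: count repeats of the same intent in the
# session and escalate the tone of a (hypothetical) LLM system prompt.
def system_prompt(history, intent):
    repeats = sum(1 for turn in history if turn["intent"] == intent)
    base = "Give the factual order status and the live-tracking link."
    if repeats == 0:
        return base
    return (base + f" The customer has raised this {repeats} time(s) "
            "already; explicitly acknowledge their rising anxiety before "
            "repeating the facts.")

history = [{"intent": "late_delivery"}]
print(system_prompt(history, "late_delivery"))
```

The facts stay the same on every turn; only the acknowledgement of the customer's state changes, which is exactly the nuance the previous generation of intent-matching bots could not express.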

24:57

And the third element, which is somewhat unique to Uber but I think

25:02

applicable to

25:03

support ecosystem in general.

25:05

A lot of time when, you know, financial transactions are involved, you know, we

25:09

have rider, we have

25:10

delivery partner, we have restaurants and we are trying to run three-sided

25:14

market place

25:14

in case of Uber, it's two-sided market place in terms of mobility.

25:20

While also keeping Uber's business interest in mind, it's impossible

25:27

sometimes to

25:28

do what an individual person perceives as right by them.

25:35

A driver and a delivery partner may want more share of the total revenue; a

25:40

merchant may want

25:41

more share of the total revenue.

25:44

And that's why explaining your decision making through LLM, that is

25:49

personalized, customized

25:51

to those personas will help them understand the effort that you are making to

25:56

do right

25:56

by them.

25:58

And if you do this consistently enough, they will believe you that you have

26:02

their back no

26:03

matter what.

26:06

So my next question is for Sanjeet. These AI models are becoming much more

26:11

sophisticated.

26:12

We're looking at automating and streamlining agent activity to boost

26:17

productivity, but

26:18

how do we do this and keep the customer experience in mind?

26:23

Yeah, I think we have to stick to the human-in-the-loop approach, this

26:28

over-rotating

26:29

on thinking that we'll be able to take X part of the business and just put it

26:33

on the digital

26:34

AI-supported kind of thing, I don't think that helps.

26:38

I think we need to keep human in the loop, we need to make sure that we are

26:41

doing this

26:41

continuous feedback and improvement mechanisms.

26:44

We need to also cut this narrative of that, you know, the cost cutting bit of

26:50

replacing

26:50

humans because we are relying on the same set of people to adopt these

26:54

technologies and

26:56

then posing this as a threat to their existence.

26:59

And I would say that we need to craft mechanisms and workflows which

27:04

utilize, you

27:05

know, these digital ways of doing things, but always keeping the human empathy,

27:10

human interactions

27:11

in mind and not over-rotate on just making everything digital.

27:15

Because that model hasn't worked before, that model won't work.

27:18

There will be a lot of efficiencies that we're going to gain to be able to

27:22

invest more in

27:23

those high-value conversations with our customers in human form as well.

27:27

We need to keep that balance.

27:29

So if support executives are being incented to cut costs, who owns that

27:33

customer experience?

27:35

Is that the CCO's job?

27:37

To make sure that you're maintaining an amazing experience?

27:41

I think we all have to take that responsibility.

27:43

I think it's a support leader's job to make sure that we are well-versed and

27:48

have conviction

27:50

about our context.

27:51

We know what's good for business.

27:53

And I think it's a support leader's job also to make sure that we are

27:56

converting these cost

27:58

functions to revenue functions, increasing revenue, bringing more scalability.

28:03

And I think the bandwidth that we want to create is also to go and work cross-

28:07

functionally

28:08

to take these product and support interface and insights that we have to design

28:13

better

28:14

experiences.

28:15

So I don't think it's just a CCO's job.

28:17

It's a support leader's job as well.

28:19

And that's why what is it that we sell as ROI when we think about technologies

28:23

and solutions

28:24

like this?

28:25

Do we go and sell that?

28:26

Oh, we're going to have 30% cost reduction?

28:28

Or are we actually talking about the pain?

28:32

So the narrative around the pain has to be not just anecdotal.

28:36

It has to be data driven.

28:40

It has to also tell the impact it's having on our business, the negative impact

28:41

on the

28:42

business as well as customers.

28:44

And now it's closely tied to the retention and the expansion side of business.

28:47

Yeah.

28:48

And topic of whose job it is, I think Amazon certainly has the leadership

28:51

principle customer

28:52

obsession.

28:53

It is everybody's job.

28:56

And I think it has to start from the CEO.

28:57

Dara has a great story on this topic from when he was running Expedia.

29:02

A lot of customers, they would buy flight tickets well in advance and they

29:07

would forget.

29:08

And they would have anxiety.

29:09

They would reach out to the customer support.

29:12

Now here was the problem of keeping the job limited to the support function.

29:18

There were so many inquiries coming in:

29:20

"When is my flight?", et cetera, questions of that nature.

29:27

Because the responsibility lay only with that support function, they solved it

29:30

by hiring

29:31

more agents.

29:32

The outcome was, they never addressed the anxiety part.

29:37

As the Expedia business scaled, they had to scale the number of agents

29:41

to solve

29:42

this repetitive inquiry.

29:45

When Dara looked at the problem, he said, wait a minute.

29:48

Now why don't we apply tech to it and just send customers reminders, you know,

29:51

a couple

29:52

of days before, hey, I hope you are excited for your flight to New Mexico or

29:58

whatever.

29:59

And guess what?

30:00

First of all, it cut to the root cause of the problem where people may have

30:05

forgotten

30:06

that they had an upcoming flight.

30:07

And they were able to reduce the support cost at the same time.
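The proactive-reminder fix described in this story can be sketched as a tiny batch job. All the field names, destinations, and dates below are illustrative stand-ins, not anything from Expedia's actual systems:

```python
# Toy sketch: instead of answering "when is my flight?" tickets one by
# one, proactively message customers a couple of days before departure.
from datetime import date, timedelta

def due_reminders(bookings, today, days_ahead=2):
    """Return reminder messages for flights departing `days_ahead` days out."""
    target = today + timedelta(days=days_ahead)
    return [
        f"Hope you're excited for your flight to {b['dest']} on {b['depart']}!"
        for b in bookings
        if b["depart"] == target
    ]

bookings = [
    {"dest": "New Mexico", "depart": date(2024, 6, 3)},
    {"dest": "Tokyo", "depart": date(2024, 7, 1)},
]
print(due_reminders(bookings, today=date(2024, 6, 1)))  # only the New Mexico flight is 2 days out
```

A scheduled job like this attacks the root cause (forgetting the flight) rather than scaling agent headcount with ticket volume.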

30:13

This is why I think the teams will need to work together.

30:15

You know, the product development team cannot think of support as an afterthought.

30:20

Support, in the same way, should not be in the business of solving the same

30:24

repetitive defects

30:26

day after day without bringing it back to the product experience and fixing it on day

30:32

zero.

30:32

So Timur, I want to ask you a data question.

30:35

So enterprises have oceans of data, and AI is generating even more data.

30:42

So where do you think we are headed, particularly with unstructured data and

30:47

the ability to accurately

30:50

query with a generative AI solution that can really get at the meat of both the

30:55

structured

30:56

and the unstructured?

30:57

I mean, how do we avoid the garbage in, garbage out problem?

31:01

Yeah, a great point.

31:02

And I think that ties back to points that both Bhavesh and Sanjeet made as like

31:06

part of

31:07

it is we're dealing with SaaS sprawl, right?

31:09

We've got SaaS sprawl.

31:10

So everyone's taking in so much data.

31:13

I still think 90% of that is going to come in unstructured.

31:16

So where are we headed?

31:17

I think dealing with that, I think LLMs have been incredibly good tools at

31:23

helping to,

31:25

you know, get some signal from that noise of unstructured data.

31:29

But where we're headed, there are a few things that could be dangerous.

31:34

On the data one, you know, I think an LLM is like a university professor and you don't

31:40

need a

31:40

university professor in every engagement, right?

31:42

You can take a better-trained model and potentially direct it to an SLM, a small

31:47

language model,

31:48

or an open-source model, because in that particular case, you know, a more

31:51

mechanical interaction

31:52

may be all you need.

31:53

So being able to refine that data down to train the model in a specific point

31:59

is very

32:00

important for managing costs and actually providing, you know, the best

32:02

interaction.

32:03

We've seen small language models really outperform large language models in

32:08

many cases.
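The "university professor" point about routing routine traffic to a small model can be pictured with a minimal sketch. The complexity heuristic and model names here are invented for illustration, not any vendor's API:

```python
# Hypothetical router: cheap SLM for routine queries, LLM for hard ones.
def estimate_complexity(query: str) -> float:
    """Crude proxy: longer, multi-question queries score higher (0.0-1.0)."""
    score = min(len(query.split()) / 50.0, 1.0)
    if query.count("?") > 1:
        score = min(score + 0.3, 1.0)
    return score

def route(query: str, threshold: float = 0.5) -> str:
    """Pick a model tier based on the rough complexity estimate."""
    return "small-model" if estimate_complexity(query) < threshold else "large-model"

print(route("When is my flight?"))  # routine query -> small-model
```

In practice the heuristic would be a trained classifier, but the cost logic is the same: reserve the heavy model for engagements that need it.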

32:10

I love the mention of Amazon Bedrock in that.

32:13

I mean, what we're trying to do is make, you know, model choice accessible as

32:17

broadly

32:17

as possible so that testing can be continuous.

32:20

And then I think from beyond that, I love what Sanjeet said about RAG and

32:43

retrieval

32:44

at the LLM, right?

33:08

So, again, coming from a place like Glean, I have a bias

33:13

towards

33:14

search, I have a bias towards RAG, you know, we were able to do much of this

33:18

without doing

33:19

inference, right?

33:20

There's no reason to throw this very heavy compute at inference for that.

33:23

So that kind of all ties back to how do we make sense of this incredible mass

33:27

of unstructured

33:28

data, 90% of the world's data being unstructured.
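The retrieval-before-inference point can be illustrated with a bare-bones sketch: rank passages by overlap with the query and hand only the top few to the model. The keyword-overlap scoring is a toy stand-in for a real search index:

```python
# Toy RAG-style retrieval: surface the passages that matter so the
# expensive model only ever sees a small, grounded context.
def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k docs with the most word overlap with the query."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return scored[:k]

docs = [
    "Refunds are processed within 5 business days.",
    "Flights can be rebooked from the app.",
    "Gift cards never expire.",
]
context = retrieve("how long do refunds take", docs)
prompt = "Answer using only this context:\n" + "\n".join(context)
```

A production system would use embeddings and a vector or hybrid index, but the shape is the same: retrieval narrows the unstructured mass before any inference happens.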

33:30

And I think by being really smart along these lines, and I think SupportLogic

33:34

really

33:35

can be a key cog in the stack that's going to help us get to the right

33:39

data at

33:40

the right time and make this, you know, gen AI profitable more quickly, right?

33:45

Right now this is a huge expense line.

33:46

So those are the key points that I would call out.

33:49

>> So could you quickly talk about this agentic AI that everybody's so excited

33:56

about at the

33:57

moment?

33:58

I mean, I don't want to run long here, but just what's the potential here?

34:02

Why are people so excited about it?

34:04

I'm sure you've all got thoughts, but can we start with you?

34:07

>> Yeah, maybe we can do a lightning round on it because I'm sure we could do a

34:10

whole

34:10

separate thing.

34:11

>> I think I'm looking at you for expertise.

34:12

>> Yeah.

34:13

Well, I think with agents, again, voice, I believe voice can be that next

34:17

frontier.

34:17

It is perhaps the first frontier.

34:19

So again, Connect, and we are all in on Connect, and also I mentioned our gen

34:25

AI accelerator

34:26

that we just launched and my team helped pick the North American cohort for the

34:29

startups

34:30

and five out of the 20 we picked are voice focused.

34:34

That's how much we believe in voice being that next key frontier.

34:38

That being said, I would also say that it's very early.

34:40

We are seeing wins in agentic, but it is quite early in terms of that happening.

34:49

So in terms of that, you know, you take your Calendly, very simple, you know, is

34:56

that triggered

34:57

by anything natural language or models?

34:59

>> Yeah.

35:00

>> I think agentic is one of those elements to make this work as well from a

35:04

profitability

35:06

standpoint that we'll have to have.

35:08

>> Yeah.

35:09

>> You've got the gist.

35:11

>> Okay.

35:12

>> I'll add to it, you know, from two angles.

35:15

One is, you know, sort of a simple example of agentic AI that we can all, we

35:21

could all

35:22

benefit from in the future.

35:23

Now, today, whoever has interacted with a travel concierge from time to time, they

35:29

would help

35:29

you plan the travel itinerary, but you know, a lot of times you have to tell

35:33

them who you

35:34

are and what you like and your preferences including budget.

35:38

And they would only go as far as, well, here is a 10-day

35:41

itinerary to

35:42

this place, but after that, you know, hotel booking, flights, everything you need

35:47

to do

35:47

on your own.

35:48

Now, come the agentic world, first of all, they will know who you are.

35:54

Just with a few phrases, you'll have an itinerary to review.

35:58

If you're happy with it, they will go ahead and make the bookings.

36:02

Now this is the promise, and of course in the support realm, this is greatly

36:06

beneficial as well

36:07

because to answer or resolve support tickets, you'll need to access multiple

36:12

systems, you

36:12

need to read a bunch of information, you'll have to execute a bunch of workflow

36:17

steps,

36:18

sending the email, issuing a refund, etc.

36:21

When you can trust AI to do all of this, you know, the end-user

36:25

experience

36:25

can become extremely delightful.
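The multi-system flow just described, reading context and then executing workflow steps like emailing the customer and issuing a refund, could be sketched like this. Every tool function here is a hypothetical stub, not a real API:

```python
# Illustrative agent workflow: gather context, then chain actions.
def lookup_order(ticket):
    """Stub for reading order data from one of several backend systems."""
    return {"order_id": ticket["order_id"], "refundable": True}

def issue_refund(order):
    """Stub workflow step: trigger a refund."""
    return f"refund issued for {order['order_id']}"

def send_email(ticket, msg):
    """Stub workflow step: notify the customer."""
    return f"emailed {ticket['customer']}: {msg}"

def resolve(ticket):
    """Read context, then execute the workflow steps in order."""
    order = lookup_order(ticket)
    actions = []
    if order["refundable"]:
        actions.append(issue_refund(order))
    actions.append(send_email(ticket, "Your issue is resolved."))
    return actions

print(resolve({"order_id": "A123", "customer": "pat@example.com"}))
```

A real agent would plan these calls with a model rather than hard-coding them, but the end-to-end shape, read then act across systems, is the same.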

36:30

One of the things that doesn't get talked about enough is this.

36:34

Imagine today that

36:35

AI has all this capability to read any information from your private knowledge

36:41

base or public

36:42

knowledge base, do research, take actions on your behalf.

36:46

Are you really comfortable with it?

36:47

And it may be a personal choice, it may be a business choice.

36:49

So I think what we will need to come up with is sort of a risk modeling or risk

36:54

framework

36:55

about what it is that we, as a business, feel comfortable with AI taking

37:00

actions on

37:01

our customers' behalf versus not, and that risk profile will need to be reviewed,

37:06

pretty

37:07

much on a continuous basis as you build your own confidence in AI and as your

37:12

customers

37:13

start demanding more of you.
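One way to picture the risk framework being described: score each action, gate autonomy behind a threshold, and revisit that threshold as confidence grows. The actions and risk scores below are made up for illustration:

```python
# Hypothetical risk gate: AI acts autonomously only below a threshold
# that the business reviews and raises over time.
ACTION_RISK = {
    "send_reminder": 0.1,   # low stakes, easily reversible
    "issue_refund": 0.6,    # costs money, harder to undo
    "close_account": 0.9,   # high stakes, needs a human
}

def decide(action: str, autonomy_threshold: float) -> str:
    """Auto-execute low-risk actions; escalate the rest to a human."""
    risk = ACTION_RISK[action]
    return "auto-execute" if risk <= autonomy_threshold else "escalate to human"

# Early on, a conservative threshold routes most actions to a human;
# as trust builds, the threshold can be raised in periodic reviews.
print(decide("send_reminder", 0.3))  # auto-execute
print(decide("issue_refund", 0.3))   # escalate to human
```

The continuous-review point maps to periodically re-tuning both the per-action scores and the threshold as the business and its customers gain confidence.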

37:16

But it's definitely part of the future when it comes to AI. I'm sure,

37:23

Amazon,

37:24

Uber, all of us have invested quite heavily into agentic AI.

37:28

I don't know if you're seeing the same, but for me, with ours, we do have

37:32

Bedrock

37:33

Agents and other offerings of that sort.

37:35

Probably the most common scenarios we're seeing for agents so far is opening a

37:39

support ticket

37:40

when the interaction is not going well.

37:42

So that's in step one.

37:44

And then interestingly, because I think we all do believe in the human

37:47

interaction, but

37:48

from the security standpoint, though, I think we're also trying to solve for

37:53

the, you know,

37:54

there is an overwhelming number of security flags that can happen.

37:58

So having agents who we trust to start doing some prioritization on that, that

38:03

's very

38:03

early.

38:04

But I think those are some of the areas where we see the biggest upside in the

38:06

near term.

38:07

Yeah.

38:08

So we are almost out of time.

38:11

We've got that dangerous before-lunch spot.

38:14

We have a hungry audience.

38:16

So I would like to ask each of you to take one minute and give advice to people

38:22

who are

38:22

still early in their AI or gen AI journey.

38:26

What can they do to get started?

38:28

What can they do to make sure they're getting the most value out of their

38:31

investment?

38:32

So one quick take from each of you. Sanjeet?

38:35

So I would say start now.

38:37

Don't delay this.

38:38

You're already three to four, maybe five years late.

38:42

I think find the pain, find the real pain, find the right use case and get

38:46

started with

38:47

it.

38:48

Remove the friction points and focus on that and stay competitive in terms of

38:53

cost.

38:54

And just realize the value of that particular use case before you get on

38:57

to the next.

38:58

But start now.

38:59

Great advice.

39:00

Yeah.

39:01

Bhavesh.

39:02

I would expand on that by saying just increase your innovation funnel.

39:08

One of the things we have seen at Uber and a few other companies that I'm

39:11

interacting

39:12

with that worked really, really well is hackathons.

39:15

And you won't believe the kinds of ideas and challenges people are applying

39:21

AI to and

39:21

finding benefits.

39:23

It's almost impossible I would say to dictate top down.

39:27

This is where you will be using AI and this is where you will be getting ROIs.

39:32

Increase your innovation funnel as much as you can.

39:34

And then create a culture of collaboration because you know this idea has come

39:38

from data

39:39

scientists, product managers, engineers of course; have them work together,

39:43

create

39:44

pods, and have sort of early success metrics in mind so that you know whether to

39:50

pull

39:51

the plug or double down on this.

39:52

And the third and most critical factor is executive buy-in.

39:59

In the few startups I'm involved in, I have seen that wherever executives have

40:04

personally

40:05

leaned into generative AI, either by coding themselves using GitHub Copilot or

40:08

cursor

40:12

with Claude, etcetera, and/or building some of these use case prototypes themselves.

40:19

They have a much more grounded perspective on where AI stands and what it can do

40:26

for

40:26

your business.

40:27

Versus leaders who have just maybe read news about what AI can do, and PR can

40:33

often be

40:35

more pompous or misleading.

40:37

On those projects, you know, I would say, working internally with leaders who have

40:42

personally leaned

40:43

in can give you much more rope, can give you much more investment and can

40:48

create a culture

40:49

and framework for you to succeed.

40:51

>> Timur, you get the closing word.

40:54

>> All right, before lunch.

40:55

So I would say a kind of combo answer: with generative AI, we're talking AI

40:58

specifically

40:59

I'd say stay horizontal for now.

41:01

One thing I love about SupportLogic is it's a horizontal solution that can

41:04

transform your

41:05

business and what we're seeing at AWS and beyond is that as much as people are

41:09

saying

41:09

hey, verticalize, verticalize, verticalize, it's too early in agentic.

41:13

Do something that knocks out a big problem and then build off of that to look

41:17

at your

41:18

gen AI strategy, and I like SupportLogic for that; it's almost a platform to

41:23

attack broadly.

41:24

We saw the same thing when I was at Glean, a rocket ship based on horizontal

41:30

knowledge management

41:32

and then, combo with that, I'd say take the startup mentality and eat your own

41:35

dog food.

41:36

Like do use this.

41:38

If you're not using gen AI yourself daily you're not going to have the

41:43

experience to

41:44

see how it's impacting your customers and your own tools, and it's hard

41:50

to believe

41:50

it's only been 20 months but the day to day usage has just blown my mind and

41:56

has transformed

41:57

my work life in the last 20 months more than anything I can think of, other than mobile.

42:03

So those would be my points.

42:06

My final thought would be it takes a village to build a successful AI project

42:10

and we've

42:11

obviously got brilliant people here with a lot of ideas to share and I know all

42:16

of you

42:17

in the audience probably know at least one thing about AI that no one else

42:20

knows so I

42:21

would really encourage you while you're here to network as much as possible ask

42:25

each other

42:25

questions challenge each other.

42:28

You know we are the people in this room that are going to raise the support

42:33

organization

42:34

up to become as strategic as sales,

42:37

as strategic as marketing, as strategic as product.

42:40

So let's all work together and hopefully next year we can all get together and

42:46

talk

42:46

about our amazing successes.

42:49

So thank you all very much for joining me.

42:51

Thanks for having us today and have an amazing lunch.
