John Ragsdale & Judith Platz & Vineet Puri & Sariel Moshe 46 min

The Knowledge Gap: How to Deliver Fast, Accurate Answers to Customer Support Problems


Today, customers expect quick and precise answers to their support problems. A significant trend in information retrieval is the shift from traditional keyword-based search to answer-based interactions. As highlighted by McKinsey, customer support leaders are investing in AI-powered solutions that provide precise answers instead of just search results. This webinar will explore the importance of delivering accurate knowledge to resolve support requests and its potential impact on end-customers. We will discuss how best to use AI to enhance support services while maintaining quality and security standards. Additionally, we’ll delve into the differences between standard and precision RAG and hear how Cvent is finding success with the technology.



0:00

Hello everyone and welcome to today's live webinar, the Knowledge Gap, how to

0:08

deliver

0:08

fast, accurate answers to customer support problems brought to you by TSIA and

0:14

sponsored

0:15

by SupportLogic.

0:16

My name is Vanessa Lucero and I'll be your moderator for today.

0:20

I would now like to introduce our presenters, John Ragsdale, distinguished

0:24

researcher, vice

0:26

president of technology ecosystems for TSIA, Judith Platz, chief customer

0:32

officer for

0:33

SupportLogic, Sariel Moshe, co-founder and chief product officer of XFIND, and

0:40

Vineet

0:40

Puri, senior vice president of global client services for Cvent.

0:46

As with all of our TSIA webinars, we have a lot of exciting content to cover in

0:50

the next

0:50

45 minutes.

0:52

So let's jump right in and get started.

0:55

John, over to you.

0:57

Well, thank you Vanessa.

0:59

Hello everyone and welcome to today's webinar.

1:02

We've got a lot of content for you, so I'm going to dive right in a quick peek

1:07

at our agenda

1:08

for the day.

1:09

We're going to open with a discussion about the changing landscape of search

1:13

and knowledge

1:14

retrieval, the potential impact of GenAI and answer engines, and Judy and I will

1:19

have a

1:20

bit of a conversation about that.

1:22

I will turn it over to Sariel to talk about Precision RAG, which he has covered

1:28

previously

1:29

with the TSIA audience and how it helped the support experience.

1:34

Then we're going to hear from Vineet on Cvent's success story.

1:39

TSIA is a Cvent customer, so I'm very excited to hear what they're working on.

1:46

And hopefully all of you saw the very exciting announcement today that SupportLogic

1:50

and

1:50

XFIND have joined forces, so we're going to talk a little bit about that.

1:55

And we're going to close out with a group discussion about the key elements to

1:59

implementing

2:00

Precision RAG.

2:02

So when we talk about GenAI and self-service, the focus has so far really been

2:09

on savings:

2:10

agent productivity, increasing deflection, and self-service success.

2:16

But I think that what we all need to keep in mind is improving self-service is

2:20

very important

2:21

from a customer experience standpoint.

2:24

75% of customers say they prefer or occasionally use self-service for support.

2:30

So investing cool new technology in the channel that customers prefer to use

2:35

really makes a

2:36

lot of sense.

2:37

So while I know there is great ROI on deflection and you can use that to make

2:41

your business

2:42

case, you're also meeting the customers where they live.

2:47

We're talking about really the change in the support experience, self-service

2:53

experience

2:54

by introducing this GenAI.

2:57

And if we think historically that we're all used to that search box, what we're

3:01

really

3:02

talking about is going from a list of possible search matches to dynamically

3:08

created responses,

3:09

which can be personalized based on the account, based on the profiles, and

3:15

pulling data from

3:16

multiple places to create a really accurate, meaningful answer.

3:21

So Judy, you have been in knowledge management circles for almost as long as I

3:28

have.

3:28

Can you talk about why GenAI is so important from a customer experience

3:35

standpoint?

3:36

Oh, John, absolutely.

3:39

And it's great to be here with you today, and Sariel and Vineet, you as well.

3:43

And to the audience, thank you for joining.

3:45

But John, we've talked about this for so long.

3:49

The fact is we have to, like you said, meet the customers where they are, but

3:55

give them

3:56

the lowest effort possible to resolve their questions, to resolve their

4:00

challenges and

4:01

issues.

4:02

But really, we have to get better at the precise answers.

4:06

And what is more frustrating to anybody than when you go into a search and you're

4:10

just

4:11

presented back with a dump?

4:14

You get articles and you get release notes and you get documentation.

4:19

And you want to believe that the one at the top is the right thing and that's

4:22

the one

4:23

you should go with.

4:24

But do you ever really know?

4:26

And I think that it's time in the industry.

4:31

This is a problem that needs solving.

4:33

And I'm so excited about the X-Find acquisition that we've made because I think

4:38

we've finally

4:39

solved an easy but difficult problem.

4:43

Yeah, I agree.

4:46

And I think that, you know, IT is really looking to buy as much technology from

4:51

a single provider

4:52

as possible.

4:53

And there was this Venn diagram where SupportLogic and XFIND lived.

4:58

And it's great to see that Venn diagram coming closer together.

5:03

So when we survey TSIA members about, you know, their impressions of how GenAI is

5:08

going to

5:09

impact the knowledge management program, the majority of companies understand

5:13

there is

5:14

going to be an impact, potentially major impact.

5:18

And we think about the pain points that we've always heard.

5:21

There's never enough resources to create content.

5:25

How do we do contextual content suggestions for employees in real time when

5:30

they're working

5:31

with a customer?

5:33

How can we create that conversational experience?

5:35

So it's not just a customer searching for one or two words and hoping they're

5:39

going to

5:40

get a search match back that is meaningful.

5:44

And something that SupportLogic is really good at is the automated workflows.

5:48

And we use AI and GenAI to really create a low-effort experience for customers,

5:55

as you mentioned,

5:56

but also for employees as well.

5:59

So could you give me some thoughts on really the potential for transformation

6:04

to knowledge

6:05

management support processes in general?

6:09

You hit the nail on the head when you said, you know, it's been a constant

6:12

conversation.

6:13

You know, do the agents have time to create the knowledge content?

6:16

And John, you wrote a paper years ago, the knowledge management maturity model

6:21

that you

6:21

put together, right?

6:23

And we talked about mature organizations.

6:26

Knowledge wasn't just a support challenge.

6:29

It wasn't just a support driven event.

6:32

It was something that was across the entire enterprise.

6:37

And I think, you know, where this is going to be amazing for support

6:41

organizations, they're

6:42

going to benefit, the customers are going to benefit.

6:44

But the entire organization's knowledge will become available to anybody

6:49

searching for

6:50

it.

6:51

And today, you know, it still lives in silos.

6:54

We know that.

6:55

And yet getting it out of those silos has been, not a fight, but it's been a

7:01

challenge.

7:02

I think that this transformation and how we're going to go about having

7:06

knowledge available,

7:08

you know, one of the largest repositories for knowledge is in fact the case

7:11

management

7:12

system.

7:13

But I think, John, you used to have a number that I remember reciting when we

7:17

were doing

7:18

assessments that I think it was 17 to 19 places where knowledge could live in

7:25

an organization.

7:27

And that people were searching that many places.

7:30

So this today is really, you know, the opportunity.

7:35

And, you know, I think that we're starting to see the switch where people do

7:39

realize that

7:40

knowledge is not just a support issue.

7:44

It's a company-wide enterprise-wide opportunity.

7:48

Yeah, I agree completely.

7:51

And in this digital world and a customer experience world, anything we can use

7:56

to create better

7:57

content at every step of the customer journey is really going to have a big

8:01

impact on annual

8:02

recurring revenue.

8:04

So clearly support is a great place to start with GenAI, with its large volume of

8:11

repetitive questions.

8:12

We see from this data that the majority of companies are already experimenting

8:18

with GenAI

8:19

within support, 50 percent external use with customers.

8:23

This data is about six months old.

8:25

So I suspect those numbers are even higher today.

8:28

But I think one of the challenges is there was an article in the San Jose

8:33

Mercury News this

8:34

week that how many billions of dollars that venture capitalists have been

8:38

investing in

8:39

AI companies and GenAI in particular.

8:43

And every week I'm introduced to a new GenAI tool that's, you know, a couple of

8:48

people

8:48

working out of their garage somewhere.

8:51

And IT has been hit with so many requests for projects that I worry that

8:56

support may

8:57

get lost in the shuffle.

8:58

We know that support isn't always seen as the most strategic place for

9:03

investments.

9:04

But because they have such a strong business case, I think that they easily can

9:09

make that

9:09

case with IT to make sure that they're top of the funnel when they're looking

9:14

at IT resources.

9:16

Is that something that you are saying is support being successful and getting

9:21

high on the priority

9:22

list for projects?

9:27

I'm not going to say that we're 100% there yet, but we are certainly seeing

9:32

more success.

9:33

And I think the interesting thing about it is, you know, support leaders have

9:37

been looking

9:38

at AI for a while.

9:40

They really have.

9:41

They've been poking around and exploring.

9:43

Now this new onset of AI has changed it.

9:46

You know, all organizations are talking about AI all the time, right?

9:50

And yet support has started.

9:53

And I think, again, SupportLogic, having a platform that's been out there now

9:58

for eight

9:59

years.

10:00

Thankfully, we've earned the trust of organizations.

10:04

And I think that AI companies that are starting up, you know, it's just, like

10:09

you said, they're

10:09

the smallest.

10:11

We say they're in a garage, but it's two or three people who are doing amazing

10:14

things.

10:15

And I'd say to support leaders, stop viewing IT as the enemy and start to

10:22

really

10:23

build those relationships because I think that IT truly does want to be there

10:27

and recognizes

10:28

that support is not just asking for tools for the sake of asking for tools.

10:32

They really are asking for things that will benefit the entire organization.

10:37

It just happens to be starting in support.

10:40

But we all remember, you know, being prioritized after, let's say, sales or

10:44

after marketing,

10:46

right?

10:47

It's always happened or after product and engineering.

10:49

But stay strong in this pursuit and really build those business relationships

10:54

with IT

10:54

because again, not bringing in niche, but bringing in platform plays and

10:59

bringing in

10:59

a combined tool is super, super important.

11:03

And more support leaders that I've talked to recently are looking at that and not

11:08

looking

11:09

at, you know, having 15 niche providers or 25 niche providers.

11:15

And I think that brings us to this data point about where people are going for

11:22

GenAI capabilities.

11:23

The majority of companies are looking for GenAI as part of an intelligent

11:28

search platform,

11:29

analytics based search, cognitive search, whatever you want to call it, tools

11:32

like XFIND

11:34

because they're already doing so much machine learning on what people are

11:38

asking the most

11:39

popular content that they're selecting when they're looking for a particular

11:43

answer and

11:44

putting GenAI on top of that means that you're going to get time to value much

11:49

faster.

11:50

So all of these great tools that are coming on the market that are just GenAI,

11:55

in my opinion,

11:56

are going to take much longer to validate, to do the training on your content

12:01

and ultimately

12:02

to prove useful to employees and for customers.

12:07

So again, I'm thrilled to see that XFIND is now part of SupportLogic.

12:12

I think that's a very logical next step for people evaluating technology.

12:18

But you know, you mentioned I've been doing webinars on AI for a decade now and

12:23

GenAI

12:24

has been out maybe 18 months, but it sounds like you're really seeing the

12:29

conversation

12:30

shifting by executives.

12:33

What are you hearing from your customers and prospects?

12:36

Yeah, I think what we're seeing here, John, and you know, your slide is validating

12:42

that

12:43

trust is growing, right?

12:44

AI is not going away.

12:46

It's not just that flash in the pan.

12:48

It's here.

12:49

But I think what customers really are wanting is the deep business experience

12:56

and understanding

12:58

and the industry knowledge.

13:00

And I think one thing that we know is you have to understand support to be able

13:07

to truly

13:08

give support tools because you can't just create a tool and think that you know

13:14

support.

13:14

Every support organization is unique and different, but there is so much

13:19

complexity to running

13:20

a support organization.

13:21

And I think that our customers truly, truly deserve enterprise solutions today,

13:27

not experiments.

13:28

And I think a lot of the tools that we're seeing feel like experiments.

13:33

Yeah, absolutely.

13:35

Absolutely.

13:36

And while I admire it, we just did a case study last week, and they're

13:40

a great

13:41

example of a company that will throw anything at the wall to see what sticks

13:45

and even if

13:46

only 10% of them work.

13:48

But not everybody has the budget or the culture to process like that.

13:53

So enterprise grade solutions are really critical.

13:59

Well, I mentioned RAG, which I think everybody is familiar with now, retrieval-

14:03

augmented

14:04

generation.

14:05

And before I turn it over to Sariel to dive into that, I wanted to do a plug.

14:09

The very first time we introduced the concept of RAG to TSIA members was

14:16

October of 2023.

14:18

And Sariel did an amazing webinar, you can find it on our website if you want

14:23

to watch

14:23

the on-demand version.

14:25

Really explaining what it is, how you use it, what the advantages are.

14:29

But boy, since October, things have moved really, really quickly.

14:33

So Sariel, I'd love to turn things over to you to kind of give us an update on

14:38

RAG,

14:39

now becoming precision RAG and what you've been seeing as the evolution.

14:45

Thank you, John.

14:47

Thank you, Judy.

14:48

And I'll just note that I'm really, really excited about us joining Support

14:53

Logic.

14:54

It's really, I'm very excited about what we're going to be building together in

14:58

the future.

15:01

But yeah, to go back to my webinar with you a year ago, so back then RAG was a

15:08

concept

15:09

that was just coming out.

15:10

It was a few months old: retrieval-augmented generation, which was pretty much

15:17

people

15:17

understanding that, okay, now we have these large language models.

15:22

And ChatGPT is showing us what's possible to do with them.

15:26

How do we implement these in an enterprise context without hallucinations,

15:33

without it

15:34

making up stuff?

15:35

How do we actually get an enterprise grade experience with this new powerful

15:40

engine that

15:41

we're seeing?

15:43

And back then it was a concept everyone was getting excited about, but everyone

15:48

was also

15:49

very worried about, right?

15:51

AI, this new thing, we don't know what it's going to do.

15:54

We're going to take a step back and let the early adopters try it out.

16:03

And as you said, we've come a long way since, I think both in the willingness

16:07

to try it

16:08

out, just like you showed in the slide before, and in the power of the large

16:14

language models

16:15

themselves.

16:16

I mean, it's kind of hard to miss.

16:19

Like every week there's a new model coming out.

16:22

We have Llama, we have Claude, we have OpenAI, everyone's competing to build the

16:26

new

16:26

large language model and small language models as well.

16:30

So a lot's going on, but one thing hasn't changed since last year, and it's

16:37

actually

16:38

become even clearer.

16:39

And this goes back to your discussion with Judy, which is there are quite a few

16:45

challenges

16:46

to developing RAG that not only works, but is actually reliable and reliable in

16:51

an enterprise

16:52

setting.

16:54

And when I talk about an enterprise setting, and support leaders obviously know

17:01

that we're

17:02

not just talking about how do we take a nicely built knowledge base and use

17:08

that to power

17:09

answers in whatever use case we want.

17:13

We want to really be able to work with the black sheep of knowledge, right?

17:17

Past cases, Jira, Slack channels, and to be able to work with those, that is a

17:24

whole

17:25

other level of complexity.

17:28

And by the way, even knowledge bases, nicely organized knowledge bases, if they're

17:33

very

17:33

domain specific, like we're talking about cybersecurity companies or hardware

17:39

companies,

17:40

in those cases, even then the large language models won't be able to really do

17:44

the work

17:45

themselves.

17:46

And we don't have enough of context yet, at least, to be able to really deal

17:50

with that

17:51

by themselves.

17:52

And so it can't just be a wrapper on ChatGPT or whatever large language

17:56

model you

17:57

want.

17:59

And actually, many companies, we've seen this in the past year, many companies

18:01

have been

18:02

trying to build it by themselves saying, okay, we'll just take a vector

18:07

database, push

18:08

an embedding into it, connect it to OpenAI, and it'll probably work.

18:12

And it just doesn't, in many cases.
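
To make the do-it-yourself pattern concrete, here is a minimal, illustrative sketch of that approach: a toy bag-of-words embedding, an in-memory list standing in for the vector database, and a placeholder generate_answer function in place of a real large language model call. None of these names or functions come from any vendor's actual implementation; they are assumptions for the example.

```python
# Minimal sketch of the "DIY RAG" pattern described above: embed documents,
# store vectors, retrieve nearest neighbors, and hand them to an LLM.
# The embedding and generate_answer() are toy stand-ins, not a real model.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real system would use a trained model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

documents = [
    "How to reset a user password in the admin console.",
    "Troubleshooting failed event registration emails.",
]
vector_store = [(doc, embed(doc)) for doc in documents]   # the "vector DB"

def retrieve(query: str, k: int = 2):
    q = embed(query)
    return sorted(vector_store, key=lambda d: cosine(q, d[1]), reverse=True)[:k]

def generate_answer(query: str, context: list) -> str:
    # Placeholder for an LLM call; here we only build the prompt we would send.
    prompt = "Answer using only this context:\n" + "\n".join(context) + f"\nQuestion: {query}"
    return prompt

hits = [doc for doc, _ in retrieve("password reset not working")]
print(generate_answer("password reset not working", hits))
```

The point of the sketch is only to show how thin this wrapper is; as the discussion above suggests, it tends to break down on messy, domain-specific enterprise data.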

18:15

And really, the challenges come down to a few different places in the process.

18:24

And that's what I'd like to describe in this slide.

18:29

So the challenges, first of all, are really the data source complexity.

18:36

As I said, domain specific, the black sheep, the past cases in Jira, you can't

18:41

just throw

18:42

that at a large language model and expect magic.

18:45

If it doesn't know the context, if it's an unorganized case with 100 comments

18:51

long,

18:52

it's just it's not going to work.

18:54

So you really need to be able to deal with that complexity.

18:58

The second, and I think this is maybe the most important point, is that one of the

19:03

biggest misconceptions

19:05

I've been coming across again and again is that people don't understand that with RAG,

19:10

actually

19:11

the most important part of it is not the large language model at the end, but

19:15

the search engine

19:16

that's feeding that large language model with the context.

19:20

And if you don't get a good enough search engine, and this goes again back to

19:23

one of

19:23

your slides earlier, that companies are aiming for the vendors that can combine

19:29

those two,

19:29

that can provide you a natural language search engine, plus the AI element,

19:34

because those

19:35

really go together.

19:36

If you're not able to focus on the right context, then you're not going to feed

19:42

the large language

19:42

model with relevant enough information to actually develop a good answer.

19:47

Another point in that area is the cost and the context window.

19:54

So a lot of these new models coming in, one of the big numbers they're talking

20:00

about

20:00

are the context window.

20:01

How much information you can feed them in a prompt, how much information they

20:06

can feed

20:07

out in a prompt.

20:09

But the problem is, again, you can throw as much information as you want at

20:14

these large

20:15

language models.

20:17

It's not going to work if it's just not good enough for complex scenarios.

20:23

You really have to be very focused in the specific parts of information you're

20:28

feeding

20:28

it, and this also goes into cost.

20:32

If you're just playing around with ChatGPT, that's nice.

20:35

But if you want to build this at scale for companies that have tens or hundreds

20:40

of thousands

20:41

of cases a year, that's a whole different question.

20:47

And really, you have to be very precise in the way you use these large language

20:52

models.

20:52

Fourth challenge, and then I'll get into how we deal with these.

20:56

The fourth challenge is search failure.

20:59

And that's probably the most important one that people don't even think about,

21:04

which

21:04

is more importantly, especially in support, but not only in support, more

21:11

importantly

21:12

than being able to answer the question with the knowledge that's in place, is

21:17

to be able

21:18

to know when you don't have an answer and not answer at all.

21:22

Because in support, we're dealing with customers that can get angry very

21:27

quickly if you just

21:28

throw at them garbage, and we don't want to throw at them garbage.

21:31

So it's really critical.

21:33

We're able to detect when we don't have the relevant information, and then we

21:39

can maybe

21:39

use other sources, or we can just take it back to the agent, but by all means,

21:44

not

21:45

answer with a merely near-relevant answer.

21:48

So how do we deal with all these challenges at XFIND?

21:53

So first things first is, and this is a very important piece that, again, I

21:59

think a lot

22:00

of do-it-yourself approaches totally ignore, which is parsing and cleaning the

22:06

data.

22:07

So this is true for all different types of data, and you need to be able to do

22:13

it very

22:14

efficiently.

22:15

But we're talking about, again, a large knowledge base, like 10,000 articles, and

22:20

it's constantly

22:21

updating, hundreds of thousands of cases, you need to be able to efficiently

22:24

take those

22:25

items, clean them, and make them ready to feed into the large language models as well, and to

22:30

and to

22:30

feed into the search engine before that.

22:34

That's number one.

22:35

That's one important piece that we've built:

22:39

technology, specifically pipelines that are able to take cases, they're able to

22:42

take knowledge

22:43

articles, they're able to take Jira tickets, and efficiently parse them and

22:48

clean them,

22:49

and do this at scale across many different types of customers and many

22:52

different types

22:53

of use cases.
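
As a rough illustration of the kind of parsing and cleaning step described here, and not XFIND's actual pipeline, the sketch below strips hypothetical boilerplate such as email signatures and quoted headers from a raw case thread before it is indexed. The regex patterns, field names, and thread cap are assumptions made for the example.

```python
import re

# Hypothetical boilerplate patterns often seen in support case comments;
# a production pipeline would be far more extensive and data-driven.
BOILERPLATE = [
    re.compile(r"(?im)^--\s*$.*", re.S),                 # signature delimiter and everything after it
    re.compile(r"(?im)^(from|sent|to|subject):.*$"),     # quoted email headers
    re.compile(r"(?i)this message .* confidential.*"),   # legal footers
]

def clean_comment(text: str) -> str:
    for pattern in BOILERPLATE:
        text = pattern.sub("", text)
    return re.sub(r"\s+", " ", text).strip()             # collapse whitespace

def parse_case(case: dict) -> dict:
    """Turn a raw case (subject plus a long comment thread) into an indexable record."""
    comments = [clean_comment(c) for c in case.get("comments", [])]
    comments = [c for c in comments if c]                # drop now-empty comments
    return {
        "id": case["id"],
        "title": case["subject"].strip(),
        "body": " ".join(comments[:20]),                 # cap very long threads
    }

raw = {"id": "CS-1042", "subject": "Login fails after SSO change",
       "comments": ["Customer reports login loops.\n--\nJane Doe\nSupport Engineer",
                    "From: customer@example.com\nResolved by clearing the IdP cache."]}
print(parse_case(raw))
```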

22:55

The second part is, again, the vector DB is like the do-it-yourself approach.

23:00

We don't take that approach.

23:02

We say we want a much more robust search engine that takes into account any

23:07

model and any

23:08

method that's available out there that can really catch context.

23:14

Why is catching context important?

23:16

Because if we're talking about natural language queries, and later we'll see

23:20

that natural language

23:21

queries can actually be an entire support case as a query, we want to be able

23:24

to really

23:25

understand what's going on in that case.

23:28

An embedding in a vector database can do that, but not at the level of precision

23:34

that we're expecting.

23:36

We really need to be able to catch many different elements in the information

23:39

on a lexical level

23:41

and on a semantic level.

23:43

We really want a much more robust search engine feeding the information.
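
One simplified way to picture a retrieval layer that looks at both lexical and semantic signals, rather than relying on a single embedding lookup, is to score every document on both and blend the scores. The toy scoring functions and the blending weight below are illustrative assumptions only, not the engine described in this webinar.

```python
import math
from collections import Counter

def lexical_score(query: str, doc: str) -> float:
    # Keyword overlap, a stand-in for BM25-style term matching.
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def semantic_score(query: str, doc: str) -> float:
    # Toy "semantic" similarity; a real engine would use learned embeddings.
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    dot = sum(q[t] * d[t] for t in q)
    norm = math.sqrt(sum(v * v for v in q.values())) * math.sqrt(sum(v * v for v in d.values()))
    return dot / norm if norm else 0.0

def hybrid_rank(query: str, docs: list[str], alpha: float = 0.5):
    # Blend both signals; alpha is a tunable weight, chosen arbitrarily here.
    scored = [(alpha * lexical_score(query, d) + (1 - alpha) * semantic_score(query, d), d)
              for d in docs]
    return sorted(scored, reverse=True)

docs = ["Registration email bounces for large events",
        "Resetting attendee passwords in the admin console"]
for score, doc in hybrid_rank("attendee cannot reset password", docs):
    print(round(score, 2), doc)
```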

23:48

Then we come to how do we focus on the specific context, and this is the

23:54

passages part.

23:55

The idea is, okay, we found the top most relevant items, knowledge article or

24:01

case or ticket,

24:03

that we want to feed the large language model.

24:05

But at this point we don't want to feed the entire item, we want to

24:09

feed the

24:09

specific part that's actually going to answer the question, again because of

24:14

cost and because

24:15

of making sure that we're actually going to get a good answer.

24:18

And then come the guardrails.

24:19

The guardrails, what they're doing is, again, avoiding answering questions when

24:24

we don't

24:25

actually have any relevant knowledge.

24:27

The guardrails really break down into two, and these are proprietary technology

24:31

that we've

24:31

developed over the years.

24:34

One is detecting when the query is not actually related at all to the knowledge

24:40

base.

24:41

So maybe you've heard of all these cases where customers come in and ask an

24:46

unrelated question

24:47

to a chatbot and the chatbot just answers, even though it's totally unrelated

24:52

to what

24:52

that company is doing.

24:54

We want to stop that at the get-go.

24:57

If someone's coming in and asking, "How do I walk my dog?"

25:00

That's not relevant to this chatbot.

25:02

No answers.

25:03

We can't find any answer for that.

25:06

Number two is even if it is relevant to the knowledge base, if we can't find

25:10

any actually

25:11

relevant item in the list provided, we want to tell that to the user.

25:16

We have this list, but we don't believe any of these items are actually going

25:19

to answer

25:19

your question, "What do you want to do at this point?"
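
The two guardrails described here, refusing questions that are unrelated to the knowledge base and refusing to answer when no retrieved item is relevant enough, can be pictured as threshold checks in front of the answer step. The relevance_score function and the thresholds below are hypothetical placeholders, not the proprietary logic mentioned above.

```python
# Illustrative guardrail checks before letting a RAG pipeline answer.
# relevance_score() is a placeholder; thresholds are arbitrary examples.

DOMAIN_THRESHOLD = 0.35     # below this, the query looks unrelated to the knowledge base
ANSWER_THRESHOLD = 0.60     # below this, no retrieved item is trusted to answer

def relevance_score(query: str, doc: str) -> float:
    # Stand-in: fraction of query words found in the document.
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def guarded_answer(query: str, retrieved_docs: list[str]) -> str:
    scores = [relevance_score(query, d) for d in retrieved_docs]
    best = max(scores, default=0.0)
    if best < DOMAIN_THRESHOLD:
        return "This question doesn't appear related to our knowledge base."
    if best < ANSWER_THRESHOLD:
        return "We found some material, but nothing we trust to answer this. Escalating to an agent."
    top_doc = retrieved_docs[scores.index(best)]
    return f"Answer drafted from: {top_doc}"   # a real system would call the LLM here

docs = ["How to reissue attendee confirmation emails"]
print(guarded_answer("how do I walk my dog", docs))
print(guarded_answer("resend attendee confirmation emails", docs))
```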

25:23

So that's what we've been building, and we've been really focusing on obviously

25:27

for the past

25:28

year, year and a half, and that's what we're bringing into SupportLogic.

25:31

This is really technology at a whole other level than what I believe

25:36

is really

25:37

out there in the market at the moment in terms of how do we build, again, this

25:44

RAG process

25:45

that can power many different use cases, and we'll touch that later on, in

25:50

complex scenarios,

25:52

dealing with complex data and as enterprise grade for the most complex types of

25:58

customer

25:58

support and otherwise in the enterprise.

26:04

Now, moving on to our work with Cvent.

26:17

So I'd like to start out before I hand it over to Judy and Vineet.

26:22

Maybe start out, and this again connects to what we've discussed earlier, is

26:26

how XFIND

26:27

got into Cvent in the first place, how we started working with Cvent.

26:31

So this is about three years ago, they reached out to us, the Cvent support

26:39

team, and what

26:40

they were telling us was that their customers were very unhappy, and again,

26:44

this is pre-AI,

26:45

pre-ChatGPT craze.

26:48

Their customers were very unhappy with the search results that they were seeing in

26:54

the portal.

26:54

They were asking questions but not getting relevant enough results, and it was

26:59

just not

26:59

a good enough experience.

27:03

And they went ahead and said, "Okay, we need really to improve this experience.

27:06

We want to make what's called self-service much more efficient for our

27:10

customers, right?

27:11

How do we help them find the information they need to solve their own issues

27:15

without having

27:16

to send in cases?"

27:18

And this is a classic issue that many enterprises deal with.

27:22

Those keyword-based search bars that are still prevalent in many different

27:28

support portals,

27:30

they're not usually really adapted to the needs of enterprises, again,

27:33

specifically

27:34

in complex data scenarios and domain-specific issues.

27:40

So they set out to find a superior search experience, and we were one of the

27:43

vendors

27:43

that they reached out to.

27:46

And I think one of the things they really liked about us was our

27:50

ability at

27:51

XFIND to deal with complex natural language queries very simply without

27:56

requiring really

27:57

any work on Cvent's side or on the customer side to filter or focus or do

28:02

anything of

28:03

that sort, or tag anything, on their end.

28:07

Our engine, as I said earlier, the search engine itself is robust enough to be

28:11

able to

28:11

deal with long queries, complex queries, and retrieve relevant items at the top

28:16

of the

28:16

list.

28:19

And they even ran a benchmark comparing us to other options they had, and we

28:24

came up on

28:25

top.

28:26

Actually, on par and even better than what Google was showing them on their

28:33

data.

28:34

So that's how we got into Cvent.

28:36

It's been great; I love working with them.

28:39

Vineet, really, you have a great team there.

28:43

I think at this point, I hand it over to you and to Judy to discuss how you're

28:49

working

28:50

on these topics these days.

28:55

Thank you, Sariel.

28:56

Vineet, hi.

28:57

Glad that we're together here today.

28:59

Hi, Judy.

29:01

And yeah, thanks, Sariel.

29:04

I kind of agree with most of what you've said.

29:07

Judy and John, it's a pleasure to be here, especially Judy.

29:10

We've known each other for many years now, between SupportLogic and your

29:16

previous life,

29:18

I could say.

29:19

So really glad to be here.

29:23

Wonderful.

29:24

Vineet, you're known as a transformational leader in the industry, and you have

29:29

an amazing

29:29

reputation.

29:31

So I'm going to ask you a couple of tough questions, because I think that

29:34

people would

29:35

love to learn from you a little bit.

29:38

Tell me some of the things that you've been trying to do at Cvent to give your

29:43

customers

29:44

a better experience.

29:46

Sure.

29:47

First, thanks for the compliment.

29:49

Means a lot.

29:50

You'll remember the time when we first spoke and we were talking about, "Hey,

29:55

we want to

29:55

have a digital first strategy."

29:58

Frankly, I think we started in 2020.

30:02

We need to move away from the way we're doing things, and the approach we took

30:10

was give

30:10

the customer a lot of the channels they need.

30:13

Don't push them on a particular channel.

30:14

I hate companies and leaders when they push customers through a self-service

30:22

channel by

30:23

hiding the 1-800 number behind multiple pages, multiple layers there.

30:28

So the idea we had was very simple.

30:30

We wanted to go ahead and give our customers all options, and that meant self-

30:35

service options

30:36

as well.

30:37

Now, like Sariel said, we were having a difficult time getting the right data

30:42

pulled out, the

30:43

right articles pulled out in some context.

30:46

Cvent is a full-blown platform.

30:49

We do end-to-end from simple events to the most complex events, thousands of

30:54

features,

30:55

two completely different sets of offerings which are related.

31:01

So a lot of similar terminology across different products.

31:06

Basically, what I'm trying to say is fairly complex setup for you to be able to

31:12

key in

31:13

a particular search item in the search bar and hope to get the right answer.

31:17

That's what we're dealing with.

31:18

Talking about multiple products, similar lingo, a lot of context is required.

31:26

You could be in the product to do six different jobs in the same page.

31:33

So unless you could ask a query and the search engine could actually gather the

31:41

context,

31:42

you would not be able to get the right answers.

31:44

Anyway, that's some context.

31:46

So we set off on this journey and we were very fortunate to find XFIND as a

31:54

partner.

31:55

We evaluated multiple providers to be very honest and fair.

31:59

When we were looking at XFIND, we were like, it's still a startup.

32:02

Are we going to try that search engine?

32:06

But the results were really astonishing.

32:10

We were getting what we needed.

32:11

We never thought that's going to be the case.

32:13

Anyway, we started with that.

32:15

What XFIND did for us really was three things I can think of.

32:21

One, unified search.

32:22

When we're talking about pulling information, we're pulling information from

32:27

our training

32:28

content, from our KB articles, from our community forums.

32:36

All of that knowledge could be residing anywhere.

32:38

It's actually helping us pull out the right kind of content from there.

32:42

What it's also helped us do really was create a unified experience for our customers.

32:47

It doesn't matter where you are, you could still get that same experience.

32:51

Third was, exactly, federated and contextual search to be

32:56

able to get the

32:56

right answers.

32:57

So that's point one.

32:59

Before we got to that really, just sorry, digressing a little, but I agree with

33:04

Sariel,

33:05

a lot of work has to be done together to get this kind of transformation

33:11

implementation done.

33:14

A lot is said, but it's really hard to do, especially because all of

33:18

this

33:19

technology is fairly recent.

33:20

We're coming out of the woods from SOPs and decision trees to federated search

33:26

to generative

33:27

AI, precision RAG, and all of that.

33:29

So not every provider is mature.

33:32

There aren't, so to say, many platforms that are doing the end-to-end.

33:36

So that's a real challenge that people will have to deal with.

33:39

So just coming back to the point then.

33:41

So before you could go to a proper search engine, we had to get the right KB,

33:45

the

33:45

right knowledge base.

33:46

At this point in time, I'm talking about 3,500 odd different pieces of

33:52

information that reside

33:53

in our database.

33:54

Then there is internal material that we have on top of it.

33:58

So we had to get that right.

33:59

We first got that right, then we went ahead with XFIND to get search right.

34:03

Now, we didn't stop there.

34:04

We really wanted self service to become more meaningful.

34:07

And I would say that, you know, Cvent is amongst the front runners when it

34:12

comes

34:12

to adopting generative AI.

34:14

So we went ahead and built our own generative AI-based bot.

34:20

We call it agent assist.

34:21

And I saw agent assist somewhere and I'm like, really?

34:24

I mean, we thought like, we just created this.

34:27

There you go.

34:28

We created our own bot, and as of today,

34:31

I think we've run 60,000 queries and it's performing at about 85% accuracy.

34:37

So doing a very good job for us.

34:40

That's really made our agents efficient.

34:41

And from a customer's perspective, we have got reduced wait times, pointed

34:48

answers.

34:48

Things really getting done for you when it comes to self service.

34:51

Now, because we experimented with our internal bot, and this was not a customer

34:56

facing bot,

34:57

we tried it for about a year.

34:58

And then we went ahead and very recently, like 15 days back, we launched our

35:03

first customer

35:04

facing generative AI bot.

35:06

So here we are now with a customer facing bot, which is actually doing again a

35:11

terrific

35:12

job, but we had first built this proof of concept.

35:14

So that's what we have done.

35:17

And as I talk to you, we have our prototype ready for a KB bot.

35:23

So this one, what it does is when it is not able to find, when any of our bots

35:27

are not

35:27

able to find an answer and the rep already knows the answer,

35:30

he or she actually puts it down, just as a comment.

35:34

This bot actually picks up all of that information and it'll do one of two

35:37

things.

35:38

It'll either create a new article for you or it'll say, hey, you know, the

35:42

article

35:42

already exists.

35:43

It's just not catering to this particular scenario.

35:46

So you can just go ahead and feed it in here.

35:48

If you let me, just say okay, and I will go ahead and update the existing

35:52

knowledge

35:52

of yours.

35:53

And this is Slack integrated.

35:55

So imagine the convenience: in Slack, you type it in and there you go.
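
As a rough, hypothetical sketch of the KB bot flow Vineet describes, the code below takes a rep's comment from a case no bot could answer and proposes either a brand-new article or an update to the closest existing one. The similarity function, threshold, and names are all assumptions made for illustration, not Cvent's implementation.

```python
# Hypothetical sketch of the "KB bot" flow: when no bot could answer but the rep
# wrote the resolution in a comment, propose either a new article or an update.

MATCH_THRESHOLD = 0.5   # arbitrary cutoff for "an article on this already exists"

def similarity(a: str, b: str) -> float:
    # Placeholder word-overlap similarity; a real system would use embeddings.
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def propose_kb_action(rep_comment: str, articles: dict[str, str]) -> dict:
    best_id, best_score = None, 0.0
    for article_id, text in articles.items():
        score = similarity(rep_comment, text)
        if score > best_score:
            best_id, best_score = article_id, score
    if best_id and best_score >= MATCH_THRESHOLD:
        return {"action": "update", "article": best_id, "draft": rep_comment}
    return {"action": "create", "draft": rep_comment}

articles = {"KB-101": "How to resend confirmation emails to attendees"}
comment = "Resend confirmation emails fails when the event is archived; unarchive first."
print(propose_kb_action(comment, articles))
```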

35:59

So that's the most recent thing that we're working on, but that's really been

36:04

our kind

36:04

of transformation journey from almost kind of like everything, non self service

36:11

to here

36:12

we are where we are really going self service, but we don't want to hide behind

36:16

self service.

36:16

We want it to be a channel.

36:18

We want to be digital first, but human led.

36:21

So we let our customers decide how they want to be serviced and we're there to

36:24

service them.

36:27

That's fantastic.

36:29

There's an item in there that I loved.

36:32

The fact that you focused inward first and you, you know, you put the emphasis

36:38

on helping

36:38

the agents, getting their experience down.

36:42

I think you said it was almost a year before you turned outward to the customers.

36:47

And so you validate, right?

36:48

The customers don't really want ever to be experimented on.

36:53

And when you consider it from that lens, as well as taking great care of your

36:57

agents

36:58

and helping them in their day-to-day job, you can't help but win and be

37:02

successful when

37:03

you go in that order.

37:05

And that's very good.

37:06

So you became a customer of ours, thankfully, last year.

37:11

Just quickly, if you can, tell me what excites you the most about seeing us and

37:17

XFIND

37:18

come together.

37:19

What does that do for you as a buyer?

37:21

What does that do for you as the leader that you are?

37:25

Why is this a good thing?

37:27

I absolutely love this.

37:29

And I should have covered it in the first question.

37:31

But let me first quickly reflect on that comment of yours.

37:34

For me, my first customer is my team, my employees.

37:39

If I'm able to make their lives comfortable, if I'm able to give them a great

37:45

experience,

37:45

they will ensure that the customers are getting a great experience.

37:48

And it's not like people don't come to work to do a good day's job.

37:53

The problem is sometimes we're not giving them the tools that they need to be

37:57

successful

37:58

and to make the customer successful.

37:59

So it's on the leaders first to equip their people to be able to do a good

38:04

job.

38:05

And I think with a lot of this technology, that's exactly what we have been

38:08

able to do.

38:09

We have been able to equip our people to get the answers that they can provide

38:15

to the

38:15

customers to solve the query.

38:17

So that said, yeah, I should have spoken about SupportLogic.

38:23

So part of our journey really was that, okay, we have that quality team; with

38:29

all that we

38:30

could do, we could analyze one to two percent of the transactions that we're

38:35

talking about.

38:36

And I'm talking about half a million transactions in a year.

38:39

So that's a lot.

38:40

There is no chance that we're going to be able to audit all of these and stuff

38:45

like that.

38:45

So in our quest to be digital first, we were looking at something that, hey,

38:50

give us something

38:52

that kind of skims through our case data and goes through it and is able to

38:57

do something

38:58

about it.

38:59

In that journey, we also tried speech analytics.

39:03

We'd translate our calls into text, analyze it, all of that.

39:12

But frankly, we did not have much luck with it.

39:14

It was not something that I would term as a success.

39:18

And very recently we launched SupportLogic, and I tell you, not just me, my entire

39:23

team,

39:23

leadership team, we're super excited with what we're seeing.

39:26

Here's what it is doing for us.

39:28

So half a million cases in a year, it's going through our cases as we speak and

39:34

it's bucketizing

39:35

them into product opportunity, follow up requests, urgency, happy customer,

39:46

sentiment scores.

39:48

A lot of that is exactly what we need.

39:50

The minute we see anything in that particular bucket, I have my leaders, hey,

39:54

just put it

39:54

up, let's look at it, we'll talk to the person.

39:57

We're at a point where we're creating alerts so that the same case will get

40:01

routed back

40:02

to the same person who actually handled it and say, hey, you know what, the

40:05

customer's

40:05

happy, but the customer has an additional query or the customer wants a little

40:10

more

40:10

help here.

40:11

This is the current status on your case.

40:14

So now, you know that the rep already knows that, hey, this customer, I kind of

40:18

was able

40:18

to solve one part of the problem, but there is something else I need to help

40:22

with and it

40:23

is urgent and which bucket it falls in.

40:25

So it's allowing us to do a lot of things, prioritize, pick up sentiment,

40:32

identify and

40:33

really kind of stack-rank product things, you name it, and we want that

40:38

intelligence that

40:39

was humanly impossible for us to gather.
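
At a toy scale, the kind of bucketizing and alerting Vineet describes could be pictured as tagging each case and raising an alert for the buckets a leader watches. The bucket names below echo the transcript, but the keyword rules, field names, and routing are purely illustrative assumptions; a production system would rely on trained models rather than keyword lists.

```python
# Toy sketch of bucketizing cases and raising alerts, loosely echoing the
# categories mentioned above (product opportunity, follow-up, urgency).
# Keyword rules are illustrative only; a real system would use trained models.

BUCKET_KEYWORDS = {
    "urgent": ["urgent", "asap", "immediately", "down"],
    "follow_up_request": ["follow up", "still waiting", "any update"],
    "product_opportunity": ["feature request", "would be great if", "wish the product"],
}

def bucketize(case_text: str) -> list[str]:
    text = case_text.lower()
    return [bucket for bucket, words in BUCKET_KEYWORDS.items()
            if any(w in text for w in words)]

def route_alerts(cases: list[dict], watched: set[str]) -> list[dict]:
    alerts = []
    for case in cases:
        buckets = bucketize(case["text"])
        if watched & set(buckets):
            # In practice this would notify the original case owner.
            alerts.append({"case": case["id"], "owner": case["owner"], "buckets": buckets})
    return alerts

cases = [
    {"id": "CS-88", "owner": "asha", "text": "Thanks, resolved! But any update on the export bug?"},
    {"id": "CS-91", "owner": "liam", "text": "Registration page is down, need help immediately."},
]
print(route_alerts(cases, watched={"urgent", "follow_up_request"}))
```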

40:43

So I think it's a next-level kind of thing that we needed and when it comes

40:47

together

40:47

with XFIND, once you guys are truly able to integrate it, I think it will be a

40:51

great

40:52

platform play.

40:53

And there should be many takers for that.

40:57

Wonderful.

40:58

Thank you so much, Vineet.

40:59

Thank you for being here with us today to share your story.

41:03

I think there's something so powerful when industry leaders can come together

41:07

and are

41:07

not shy about discussing the fact that, hey, I had a challenge.

41:11

I had a problem and, you know, I wasn't perfect to begin with and I went out

41:16

and I, you know,

41:17

worked to solve that problem. I think that transparency is so important and I

41:24

really,

41:25

really value you as a customer and as an executive in our industry, Vineet.

41:29

Thank you.

41:31

So I'm going to move forward a little bit.

41:34

We have a lot of amazing content that we've put together for this session and,

41:41

you know,

41:42

people who are attending, I believe you'll be able to receive copies of these

41:48

slides.

41:49

Certainly we can make them available as well.

41:53

But we did stack this deck together and really had a lot in here.

41:57

But I think, you know, where we're going today and where we want to end up, you

42:02

know, with

42:02

all of this content and the session that we've had is to really just recognize

42:09

that today,

42:10

you know, there are multiple paths that customers take and engineers take to

42:16

get to the answers.

42:18

And so when we think about this from that standpoint, we have portal assist,

42:22

the chat

42:23

bot assist, Slack assist and CRM assist, you know, and each one of those is

42:28

benefiting,

42:29

you know, either the internal teams, the support teams, the end customers, you

42:34

know, the possibilities

42:35

are truly, truly limitless.

42:38

And, you know, if we can start to think about, again, that mega database that

42:42

we have sitting

42:43

there with our knowledge answers already in it because we've solved the cases

42:48

before.

42:48

And we don't have to go through this painful process of tag that case and make

42:53

that case,

42:54

submit an article, get the article reviewed and published and authored and our

42:58

agents

42:58

can literally find the information in the case management system and move on.

43:03

I think, you know, for all of us to take away today is how powerful that is.

43:08

So John, I'd love to turn it over to you as we wind down in this last minute or

43:16

two,

43:16

you know, and let's talk again about what makes a

43:22

knowledge

43:23

management project successful, because I think that's

43:27

why a lot

43:28

of people are here to learn that.

43:30

Well, that's a big topic for the minute we have remaining, but I think it is

43:39

really understanding

43:41

what the requirements for the knowledge are.

43:43

I think that that has definitely evolved a lot, not only from the vendor side

43:49

with rolling

43:50

releases and monthly releases.

43:52

The timeframes are getting a lot quicker as well as looking at some purposes

43:57

for knowledge

43:58

management such as understanding what customers are struggling with.

44:02

So we can make improvements to onboarding or to the product itself to eliminate

44:09

some

44:09

of those.

44:11

What do you think are some of the most strategic elements of a KM program?

44:16

Well, let's get the buy in.

44:19

Let's get the buy in across the company.

44:20

I think that that is vital and we have to stop again looking at it as just a

44:26

support

44:27

issue alone.

44:30

Again, having domain expertise is absolutely so, so critical.

44:38

And I think for me, John and you and I have talked about this for years, can we

44:42

please

44:43

remove the heaviness of what knowledge creation and content creation and

44:47

content cleanup feels

44:49

like?

44:50

There's some amazing methodologies out there, but just like product development

44:55

, get into

44:56

the agile way of doing things, remove the multiple layers.

45:00

I think that is going to be so critical moving forward.

45:03

Tools like this are going to help with that finally.

45:07

So hopefully, in a couple of months or a year from now, John, we're smiling

45:11

because

45:12

everything we've talked about for the last 30 years has finally happened.

45:17

Yes, if we can get over the companies that take six months to publish a

45:21

knowledge article,

45:22

that would be a fantastic first step.

45:25

Well, Vanessa, I know we are up against time.

45:28

Do you want to close us out?

45:31

We are.

45:32

Yes.

45:33

Thank you so much, John.

45:34

Just a reminder, we had a ton of questions come in.

45:36

Unfortunately, we don't have time to answer them here live, but don't worry.

45:41

We haven't forgotten about you and we will make sure to follow up.

45:45

And since we've come to the conclusion of the webinar, a few more

45:48

reminders, there

45:49

will be an exit survey, and we'd appreciate it

45:52

if you could please take a few minutes to provide your feedback on the content

45:56

and your

45:56

experience by filling out that survey.

45:59

I know the link to the recorded version of today's webinar will be sent out

46:02

within the

46:02

next 24 hours.

46:04

Thank you to Judy, Sariel, and Vineet for delivering a fantastic conversation and to

46:11

everyone for

46:12

taking the time out of your busy schedules to join us for today's live webinar,

46:17

the Knowledge

46:17

Gap, how to deliver fast, accurate answers to customer support problems brought

46:23

to you

46:23

by TSIA and sponsored by SupportLogic.

46:27

We look forward to seeing you in our next webinar.

46:29

Take care, everyone.

46:30
