In the past few years, voice and chat virtual assistants have been deployed within every single top 20 life science organization. Virtual assistants are projected to handle 75-90% of healthcare & banking queries by 2025 and are currently being created for an array of patient needs including diabetes education, oncology treatment on-boarding, and financial assistance for arthritis medications. We discuss the evolution of conversational AI in healthcare, current trends, and that hot topic that people can’t stop talking about, generative AI.

Note: MM+M uses speech-recognition software to generate transcripts, which may contain errors. Please use the transcript as a tool but check the corresponding audio before quoting this content.

Hey, this is MM+M's Marc Iskowitz, editor at large. Welcome to this MM+M podcast, titled "Beyond the Bot," on improving omnichannel patient and HCP engagements with conversational AI, sponsored by IPG Health.

In the past few years, voice and chat virtual assistants have been deployed within every single top 20 life sciences organization. Virtual assistants are projected to handle 75 to 90% of healthcare and banking queries by 2025 and are currently being created for an array of patient needs, including diabetes education, oncology treatment onboarding and financial assistance for arthritis medications. Today we'll discuss the evolution of conversational AI in healthcare,

current trends, and that hot topic people can't stop talking about, generative AI. And we have two guests on hand who live and breathe this technology to discuss the topic: Elise Whitaker, who's SVP, director of conversational experiences at FCB Health, and Bret Kinsella, founder and CEO of Voicebot.ai and Synthedia.

Elise and Bret, welcome to the MM+M podcast.

Thanks, Mark.

Thanks for having us

Looking forward to this discussion. How about we just start out with having you both introduce yourselves: give a brief background of your professional areas of interest, what you're currently doing professionally, and your connection to conversational AI. Elise, how about we start with you?

So my background really has been in digital transformation and user experience for my entire career. I've been working within the conversational AI space, specific to healthcare and pharmaceutical, for about the past five and a half years, but I really spent the majority of my time in CPG, until my older son was diagnosed with acute myeloid leukemia when he was four and a half. He's a survivor, so it's all great. But at the time it was just crazy, not only because of his prognosis and what he was going through, but also because there was such a lack of communication and understanding. For me as a caregiver, even though I had a sister-in-law who was a pediatrician at the time and a very good friend who was an oncologist, and I was able to rely on them for information, I was still so confused. Nothing made sense. I didn't understand the terminology. I didn't understand the drugs or the generic terms; I would look them up. I didn't understand generic versus branded. So once he was in survivorship, I just felt like there had to be a way for me to take all of my professional experience working within the healthcare industry, merge it with my personal experience, and really better the communication for patients and caregivers in this space. I've brought that passion into work for the past 10 years.

I'm glad to hear that your son is now, as you say, in survivorship, and it sounds like a wonderful way to merge, as you said, your personal and professional experiences into this new chapter. Bret, how about yourself?

Well, I'm really excited

about Elise's story, and how some of the technologies we'll talk about today would be so helpful, technologies she really didn't have as much access to a couple of years ago. So I've been working in the AI space since 2013, and really what I've done throughout my career is work on new platforms: back in the day of the web (remember the '90s, when that was the hot topic?), then social and mobile, a lot of enterprise technologies. But I really started focusing on AI about 10 years ago, and in particular I think I'm probably best known for our work

in conversational AI and generative AI, and that comes through a couple of different places. I founded a publication called Voicebot.ai in 2016, really focused on filling the gap around voice assistants and conversational AI technologies, which weren't being properly covered in the market. We're probably best known for that news coverage, but we're also known for research reports; we have more data on what's happened in the industry around consumer adoption than anybody else. We started covering generative AI in 2017, and that obviously picked up a lot in 2020, when GPT-3 originally came out and wowed everybody in the midst of the pandemic. Then last summer we founded another organization called Synthedia, which is about generative AI and synthetic media; that's a newsletter, and it's really just focused on that. So that's what we do: we help educate the market, and we help people understand the business of those technology spaces, but also the application space, the use cases people are really getting benefits from.

You know, we're living in this Amazon, Google, Snapchat world, where consumers have gotten used to, and really expect, speedy, prompt answers and efficient communication. Both of you have been immersed in conversational AI since the infancy of the technology, but I'd like to know how you've seen conversational AI evolve over the past five years.

I think we're living in a ChatGPT world right

now, and I would expect most of the people listening today are familiar with it. If they're not, they should check it out; it's by a little company called OpenAI, which was acquired by Microsoft, or at least they bought half of it, which is effective control. So when we think about conversational AI, a lot of people know it from the consumer standpoint: the Alexas, the Siris, the Google Assistants, the voice assistants they've interacted with, or they know it from website chatbots. And the whole idea here is that it's a little bit different from the past, where we typed things in or pressed buttons. Now we can just use natural language and we can converse.

Now, more recently, when we talk about ChatGPT, it's really an extension of that. It started out first as text, and we're now seeing a lot of voice interaction; just this week Microsoft Bing implemented its voice interaction solution, and Perplexity.ai and some others already allow you to do voice input. Generative AI is really just the next evolution of this idea that we can talk like we talk to humans, and these assistants will get back to us. They'll respond to us almost as if they're a human expert in whatever category we're asking about. That's one of the most interesting things that has happened, and really what it's doing is unlocking access to information that used to be reserved for people who were already experts in the field, because they knew where to find it and they knew how to interpret it. The traditional voice assistants would help you find information more effectively, but the generative AI assistants will summarize it, give you real-world examples of how it's used, and explain it to you like you're an eighth grader, so it's easier for you to comprehend core concepts and really explore and expand your knowledge base around areas that are important to you.

Plus they do all sorts of other things; they'll write poems for you and things like that, which is really fun and also particularly useful. So there's just a ton of use cases out there, whether it's writing assistance, research and search, or, you know, changing the way we consume the web as well.

Didn't we see some comment last week that people are even writing letters to their significant others?

Of course. Well, there was a South Park episode on this. So maybe it's just parody, but no technology is legitimate until South Park parodies it, and it only took four months (maybe not even; I think it was three months) before South Park parodied it. That's how fast this is moving. Just in case people don't know, ChatGPT launched on November 30th.

And it's kind of amazing. If you haven't checked it out, you really should, because you can ask it all sorts of questions and to do things. And there are a lot of other solutions you should try too, which we can talk about later.

It was the fastest application to reach 100 million monthly active users in history. The next closest was TikTok, which took nine months.

ChatGPT did it in about six weeks. It's astounding. You know, the adoption is incredible; it just kind of broke down a wall between the consumer and this technology. What do you see as the key use cases,

Bret?

Well, when we think about conversational assistants, there's a lot. If you think about what Alexa and Siri do, usually it's task oriented; they actually execute tasks very quickly, and the conversation is just natural language input. We think of these as one-shot interactions: you just ask it to do something and you get the response back. Turning on the TV, sending a message, whatever it might be.

When we look at generative AI, it's more about the knowing function. Think about Adam Cheyer, who was one of the founders of Siri and then Viv Labs, which became Bixby; he talks about the doing and the knowing functions of the assistant. The voice assistants to date have been very good at task execution and only okay in terms of knowledge. The generative AI solutions are really good at the knowing part: they can look at reams, vast amounts, of information and give it back to you. Most people have interacted with them in such a way that they're

looking at information they might find on the internet, but there are other applications as well. One of the best use cases, and if you haven't tried it, everyone should: take an article or a research paper, maybe a medical research paper, and copy and paste it into ChatGPT or one of these other solutions. It'll give you a summary, but you can also ask questions like, what did it mean by this? Does this report talk about this type of condition? Or you can take four articles, paste them in together, and say, could you combine these and summarize them? So that's a great use case. Writing letters and emails, maybe to your loved ones, maybe to other people; professional writing; YouTube headlines, because we all need better YouTube headlines; writing blog posts and product copy. So there's a lot; AI writing assistants are really big. But one of the things that's really revolutionizing technology is the code-writing assistants. With these systems you can just type in the kind of function you want written, say you want it in Python, and they'll give you the code. Or you can upload your code and say, hey, this code isn't working, what's wrong, and it'll actually help you debug it. It'll tell you what it thinks the functions do. So if we think about things that have already changed: writing, obviously.

And we have code, and we have images: you can just type something in and get an image generated. It's a whole other part of generative AI. And then search. I use it every day: I use Bing Chat, I use Perplexity, I use the new Google Search Generative Experience, and occasionally I'll use Bard as well. I'm saving about an hour a day in search, because I have to do a lot of research in my work as a publisher. It's helping me find those little things that used to take me a long time, because I would search, find some links, and have to read all those articles. It's helping me get to the answers much faster, and I think with much richer material than I had in the past.

Right, that's amazing. We should talk offline, both being in the publishing business. And I've heard doctors are also using it, you know, to write letters to health plans to dispute coverage decisions they're not happy with. But moving over to you, Elise: what do you see as some of the pharma-specific use cases? And if you want to splice in some of your observations on how you've seen this technology evolve in the healthcare space over the years, that'd be great to hear as well.

Yeah. So, taking a step back and focusing more on conversational AI, it's been interesting to see the evolution, because I feel like five to six years ago we were in that early adoption of virtual assistants in pharma and healthcare. When I think about one of the very first solutions I worked on that we deployed, it was for a top-five pharma company, for medical affairs, because they needed a way to augment the call center. At the time, I think one call to the call center was averaging $78 for that organization. So we were utilizing conversational AI to triage those frequently asked questions coming into the call center from HCPs, allowing them to self-serve without having to dial the 1-800 number; and that was for a portfolio of products. We got that cost down to, I think, $12 an inquiry, so massive call center savings when you really think about it. But the cool thing at the time was that it was almost so new that

the company was willing to take the risk, right? There weren't enough articles out there; there wasn't enough fear yet in place. I believe that's why, at that time, this very large pharmaceutical organization felt comfortable deploying the solution. Then over time came virtual assistants and chatbots with open-field text, and then there was voice, and voice was a super exciting time for all of us. I know Bret has mentioned this in different talks he's given, but

I think we all thought voice was it. I don't know about you, right? I felt like in two years every single search was going to be via voice, and people were only going to use voice; they wouldn't even look at screens. That was my prediction. But it was cool. I mean, we were deploying public voice skills for pharmaceutical brands. One was for a rheumatoid arthritis drug.

I won't mention the company or brand, but at the time we deployed an Amazon Alexa skill, and the only guardrail really was that you could not provide dosing or administration information.

So it was answering those questions like: my syringe is cloudy, what do I do? Or, I need a new travel pack, how do I order one? How do I get a new sharps container? How do I refrigerate my injectable? Just answering those basic questions, and there were a lot of cost and coverage questions we included within that voice skill. Similarly, we did more of an awareness skill for a breast cancer mutation, just being able to answer those frequently asked questions about the mutation. So a lot of unique, interesting ways to provide information to consumers and/or patients and caregivers in a way we felt they were going to consume it, either then or soon thereafter. And then I think, obviously,

voice was done. There were a lot of risk concerns over open-field text. Going back to virtual assistants again, a lot of those guardrails started to get strictly put in place, varying from enterprise company to enterprise company as to who would allow open-field text. And just so everyone's aware (I'm sure anyone listening knows, because everyone's in healthcare and pharma), the biggest concern with open-field text is obviously adverse events: that a patient will mention they have an adverse event that could be life threatening, and it will go unreported, or the bot won't provide them with the correct answer, and that could be life threatening and detrimental. So the concern is valid, very valid. But we were very actively creating solutions that had features like adverse event detection. If we were deploying a solution with open-field text and any user were to type in anything that, even at the lowest threshold, might possibly be

something about an adverse event, we had a solution in place that could detect that someone might be suggesting an adverse event, reply to them to confirm whether it is in fact potentially an adverse event, and then quickly escalate it to the correct parties at the pharmaceutical company. So that's what we were doing at that time in the evolution. And then, the last thing: we went through all of that, then COVID hit, and there were different needs in place.
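The flow Elise describes (low-threshold flagging, a confirmation reply, then escalation) can be sketched roughly as follows. The keyword list, weights, threshold and function names here are illustrative assumptions, not any vendor's actual implementation; real systems use trained classifiers and formal pharmacovigilance workflows.

```python
# Illustrative sketch of low-threshold adverse event (AE) screening in a chatbot.
# The keyword list, weights and threshold below are made-up placeholders.

AE_SIGNALS = {
    "rash": 0.4, "dizzy": 0.4, "nausea": 0.4, "swelling": 0.5,
    "chest pain": 0.9, "trouble breathing": 0.9, "hospital": 0.6,
}
THRESHOLD = 0.3  # deliberately low: better to over-flag than to miss an AE

def screen_for_ae(message: str) -> float:
    """Return an AE suspicion score for a free-text message."""
    text = message.lower()
    return max((w for kw, w in AE_SIGNALS.items() if kw in text), default=0.0)

def handle_message(message: str) -> str:
    score = screen_for_ae(message)
    if score >= THRESHOLD:
        # Pause the normal dialogue, ask the user to confirm,
        # then escalate to the manufacturer's safety team.
        return ("It sounds like you may be describing a side effect. "
                "Is that right? If so, I can connect you with the safety team.")
    return "ANSWER_FROM_APPROVED_CONTENT"
```

The point of the low threshold is exactly what Elise notes: even a weak hint of an adverse event should trigger the confirmation-and-escalation path rather than a normal answer.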

And now I feel like, just specific to conversational AI, we've gone back down the curve again, where a lot of pharmaceutical companies will only approve a button-driven experience. Not all, but I feel like we've gotten back to a lower plane again.

But you just deployed something with a virtual human that's conversational.

Well, you said a lot; a lot of great points, though. I will say the past years have been challenging, because there have been a lot of boundaries put in place. But luckily (and thanks for that softball, Bret), what's happened is that slowly the guardrails are coming down, depending on the organization; it varies from pharmaceutical company to pharmaceutical company. And now there is a big ask for digital people. We can talk about that more later, but a big ask for digital people, avatars; those all have voice. It's a nice hybrid between a chat solution and a voice solution. I like to say it allows users to interact in the way they prefer: if they prefer to have video on and facial recognition, awesome; if they want to type, or use buttons and never use the microphone, they can do that as well. So there are new technologies, and that's probably where we are right now. And then, of course, gen AI.

Yeah, let's drill down a little further into that, Elise. You know, we've seen over the last couple of years this shift toward personalized patient care, and at the same time, as you said, you have conversational solutions being deployed within the top 20 pharma companies, though maybe there's been a bit of retrenchment there as more of those articles have come out, as you put it. But how have you seen pharma brands evolve toward a more patient-centric approach using digital tools?

Yeah. So, not to be cliche, but I think COVID really changed patient care and put the onus more on patients. Number one, it had to, because care had to become virtual for a while; but also because, as time has evolved, patients have started to really own their healthcare more and more. They've become smarter and wiser and a lot more educated. Younger generations, I would say, are very educated; they want to own more of their healthcare decisions. And with that has come a need for digital tools that patients and caregivers can use to manage their care. So what we've seen with conversational AI that's been really awesome is taking that quote-unquote basic FAQ, button-driven chatbot (and this is where "beyond the bot" happens, right?) and turning it into a very meaningful experience, where a user can completely self-serve on their own, 24/7, by asking a question and getting an answer, but also by having authenticated experiences for patient support. So,

basically, using conversational AI to expand any patient support program: you allow for an authenticated experience from the time a user onboards onto a new protocol, and then you can have conversational AI holding their hand throughout that journey, whether it's providing onboarding information or sending reminders that they have to fill a prescription in a week. You know, we know that

certain drugs have something like a 60% drop-off when they're prescribed in the ER: a 60% drop-off after that initial fill in the hospital, after which a lot of those patients don't refill, for things like AFib, etc. So how do we help bridge that gap? What is the communication tool in place to help those patients manage their care once they leave the ER? That's just one example.

Sure. And, you know, some of us with experience with specialty drugs know that they are typically high-priced therapeutics. The industry has been shifting more toward specialty products, and each patient that goes on these products is a potentially lucrative source of revenue for the drug company, so there's probably a fair amount of hesitation about allowing a chatbot, versus a human, to manage that process of onboarding and adherence, so to speak. But it's really interesting to see the evolution there. Bret, you talked earlier about how big players like Microsoft with Bing AI, Google with Bard, and of course OpenAI with GPT-4 are making this AI chat technology, previously restricted to the test lab, more accessible to the general public. There could be some confusion in terms of whether those are quote-unquote chatbots, and what the difference is between those and, say, a chatbot deployed on a brand website, which is kind of what Elise was referring to, maybe that button-activated chatbot. Can you explain the difference?

Well, if you think about a chatbot you might have interacted with, or a voice assistant, you can think of it as a user interface first and foremost. The question is how you can interact with it. In the old days we could go to a website and click links; maybe in some places this is still true, in some of the healthcare world, where you can't ask a question unless you call somebody. Online, it's all links and buttons and those types of things to find information if you want to do any kind of self-service, and then you have to talk to a person; maybe there's some smart routing in the IVR system if they call in.

The chatbots are really this idea of being able to take, first and foremost, natural language input, so I can speak as I do as a human, because I don't have all the words that you have in the healthcare industry; I'm not an expert. In that case, I say what I think is important, what I think I'm asking for, and the natural language systems are supposed to be good at matching my intent. Once it matches your intent (it has already mapped all of these intents, the things you might want, to information that's pre-approved),

And then it just provides that to you. Or, if it doesn't have anything, it'll do what they call fallback: it'll say, oh, I'm not sure I can answer that, and maybe ask a follow-up question for clarification, or direct you to call in, or something else. So that's the model behind most of the bots you've interacted with up to this point. You have this idea of variable input, as opposed to fixed input, but really structured output, because the AI is working on the front end. It's understanding what you're saying and what you want, and then it's saying: okay, based on what you want, here are the things I know I can give you in response, whether it's task execution or information.
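In rough pseudo-implementation terms, the variable-input/structured-output pattern Bret describes might look like the sketch below. The intents, the toy token-overlap scorer and the confidence cutoff are illustrative assumptions, not any specific NLU platform; the key idea is that responses come only from a fixed, pre-approved set, with an explicit fallback.

```python
# Minimal sketch of intent matching with fallback. A toy token-overlap
# scorer stands in for a real NLU model; intents and answers are invented.
import re

INTENTS = {
    "order_travel_pack": ["need", "new", "travel", "pack", "order"],
    "storage_info": ["refrigerate", "store", "storage", "injectable", "fridge"],
}
RESPONSES = {  # structured output: pre-approved answers only
    "order_travel_pack": "Here is how to order a travel pack...",
    "storage_info": "Here is the approved storage guidance...",
}
CONFIDENCE_CUTOFF = 0.3

def match_intent(utterance: str):
    words = set(re.findall(r"[a-z']+", utterance.lower()))
    best, best_score = None, 0.0
    for intent, keywords in INTENTS.items():
        score = len(words & set(keywords)) / len(keywords)
        if score > best_score:
            best, best_score = intent, score
    return best if best_score >= CONFIDENCE_CUTOFF else None

def respond(utterance: str) -> str:
    intent = match_intent(utterance)
    if intent is None:
        # Fallback: admit uncertainty and redirect rather than guess.
        return "I'm not sure I can answer that. Could you rephrase, or call in?"
    return RESPONSES[intent]
```

Note that however varied the input, the output is always either a pre-approved response or the fallback; nothing is generated on the fly.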

Then, when we move on to Bard and Bing and all these other things going on with the giant companies, what they're doing is actually enabling variable output as well. It's still variable input; it's trying to understand what you want, in some ways. And I use the word "understand" loosely, because the way these systems operate is a little bit different; think natural language understanding versus large language models. But then the output: it's going to basically predict what the right answer for you is. It's not going to a database with pre-approved information; it's going to a model that was trained, often on pre-approved information. And I should differentiate something here.

So, with ChatGPT, everyone says, oh, it was just trained on the internet, so you never know what's going to come out of it. But these systems, when they're deployed in an enterprise workspace, are generally trained on data from that enterprise.

It might have learned from the internet to understand the structure of language: what words mean, how to put words together in an effective way to explain to you what's happening.

But it's actually looking at what they call a vector database of your information. This is where, when we think about what we build, FAQs are great, and medical documentation is great. It looks in there for the answer; very often we think of these as retrieval models. So it goes in and tries to formulate an effective way to communicate with you based on the information you've already provided. Can it be incorrect? Can it come from other places? Yes, these things do happen. But I think that's an important point of clarification: it's not just, here's the information from the internet. Mostly, the way this is being done in the enterprise (in healthcare, consumer brands, media, across the gamut of industries) is that it uses the technology that allows it to communicate effectively in natural language for open-domain, broad questions, things you never even considered might be asked, while actually looking at your approved data and trying to formulate a response that answers the question. And sometimes they supplement with things outside of your data, if that's how you set up the system.
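The retrieval pattern Bret outlines (index the approved content, find the closest passages to the question, then have the model answer only from those) can be sketched like this. The bag-of-words cosine similarity is a stand-in for a real embedding model and vector database, and the document snippets are invented examples.

```python
# Sketch of retrieval over approved content. Bag-of-words cosine similarity
# stands in for real embeddings; the snippets below are invented examples.
import math
import re
from collections import Counter

APPROVED_DOCS = [
    "Store the injectable in a refrigerator between 2 and 8 degrees Celsius.",
    "To order a new travel pack, contact the patient support line.",
]

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag of lowercase word counts."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str) -> str:
    """Return the approved passage most similar to the question."""
    q = embed(question)
    return max(APPROVED_DOCS, key=lambda d: cosine(q, embed(d)))

context = retrieve("How should I store my injectable?")
# An LLM would then be prompted to answer *only* from the retrieved passage:
prompt = f"Answer using only this approved content:\n{context}\n\nQ: How should I store my injectable?"
```

The language model supplies the fluent phrasing, but the substance of the answer is constrained to the retrieved, approved passage, which is the distinction Bret is drawing.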

If I can add to that, because I think you've touched on a lot of important points: we are at the point right now where we want to get to every pharmaceutical company having confidence in a quote-unquote open-field text box or microphone, where a user on any interface can just quickly ask a question and get an answer, without having to click a button and without having a dialogue path created for them.

But I think there's a happy medium right now. When we talk about NLU, natural language understanding, providing the correct answer, as Bret mentioned, it is scripted to a point, if you will, in that we create knowledge bases where, at the moment, we only use content that goes through MLR and gets approved.

So for us it's very, very important that we stress this: anything we create does get approved through MLR, and no content, no answer, is ever given to an open question that hasn't been approved by MLR. Now, where the machine learning comes in, and where gen AI can help beyond machine learning, is in those in-context tools we are allowed to use to help create

utterances, or to help create additional answers, or to help with multi-turn questions; someone asks about dosing and then asks about dosing for a five-year-old who's also on these drugs. That's where we do have the ability; it almost helps with time efficiencies, as Bret was saying. We can use gen AI tools in a very legal way to help us create more synonyms, so that the back end can understand that if someone says a Tegaderm or a bandage, those mean the same thing; or if someone says "dressing," whether it's in the right medical context, so we're not referring to salad dressing. It's a cheesy example, but an easy one: the user's not referring to salad dressing, they're saying, I have a lot of redness around my dressing. But that's an easy explanation of why it's

so important that the back end can understand what the user is asking and be able to answer it correctly. In anything we're doing currently at IPG Health, we are ensuring that if we use gen AI tools, they're only used in ways that help us build out the additional things we need to train the back end. We're not using it to pull content from scratch; we're using it to sift through existing content and create outlines or initial FAQs. Then we add to that content, sift through it, add conversational design, and run it all through medical, legal and regulatory review.

Yeah, just to build on top of that: these systems can help you create 50,000 variants of all the types of things people could ask, very quickly, within a few minutes. That then becomes really helpful for your NLU-based conversational assistants.
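As an illustration of the variant-generation idea, here is a toy template expander. A real workflow would prompt an LLM for paraphrases (and, per Elise's point, still route everything through MLR review); the phrase lists here are invented.

```python
# Toy sketch of utterance-variant generation for NLU training data.
# A real pipeline would ask an LLM for paraphrases; this expands templates.
from itertools import product

OPENERS = ["how do i", "can you tell me how to", "what's the way to"]
VERBS = ["store", "refrigerate", "keep"]
OBJECTS = ["my injectable", "the medication", "my pen"]

def generate_variants() -> list:
    """Combine phrase fragments into candidate training utterances."""
    return [f"{o} {v} {obj}" for o, v, obj in product(OPENERS, VERBS, OBJECTS)]

variants = generate_variants()  # 3 * 3 * 3 = 27 utterances for one intent
```

Even this crude combinatorial expansion shows how a few seed phrases fan out into dozens of training examples; an LLM-driven version scales the same idea to the tens of thousands Bret mentions.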

Conversational assistance training them to understand a wider variety of ways people could express this. So I generally think about the creation side you can use these tools to help you create more effective Solutions, even though they might be more rigid like the older systems you can use them also to interact with a customer or patient in this case who asks it and then they’ve sort of these open-ended input and open-ended output and there are certain classes of items or certain classes of questions, which are going to be fine for that. There are other classes which you can have what they call guard rails built in that if it hits a certain type of class of question, then it doesn’t it only responds from your structured database other things you might be able to do more conversational variable output. The third thing is there’s a lot of internal tools that are being used here. So for example agent assist so people who have been in the context Center World understand that the agents are always looking at knowledge bases, right? They ask a question. They don’t know the answer to everything that’s being asked what the

The large language models right now are listening alongside them and giving them suggestions on what the person means and what the topics might be. They can even give them an answer, or link to their database, making them more efficient on proper call resolution as well as speed of resolution, because we've all had that experience where the person on the other side of the line just doesn't understand what we're asking about. And here's the other great thing: by the time we've talked to the third person in our phone-tree experience journey, we'll call it, that third person gets a summary; the large language model summarizes everything that was said to the two agents I already talked to, so we don't have to explain it all again, and they can get up to speed very quickly. So there are a lot of these types of tools being used. It's not just writing a love letter or a YouTube headline or a blog post; a lot of it is actually these core problems we've always had trouble with, particularly in the contact center, where there's so much turnover. Six months, nine months on average; agents don't see enough variety in their tenure, and it takes a long time to ramp them up. This can actually make them productive faster. We're already seeing that.

Yeah, and I think it also helps with accuracy and and I go back to the time efficiency and any time efficiency results in cost efficiency. So when we’re able to use some of these generative tools on the front end to help with that content creation, it’s going to take us less time which means the end product is going to cost less to the client or customer. So, you know across the board there’s efficiencies and I think even utilizing tools and working with partners that we’ve already been working with who themselves have been building creating features within their platforms that utilize some form of Genai in a safe way has also helped us. There’s no I mean no plugs but like voice flow is definitely doing it helping us to create utterances orbiters doing it with some you know,

helping create their content and context. I'm trying to think who else; there are other GenAI companies that have been utilizing generative AI too. So I definitely think it's going to benefit all of us, because just in the short amount of time it's been here, it's been wonderful to see so many partners

somehow integrating it into their platforms, and I can only imagine what that's going to look like in another six months, too.

Yeah, those are amazing

comments and important caveats for people working in this industry as well. Elise, just to stay with you here for this next one: we've all seen the major shift in how field reps are engaging with and educating HCPs in this low-see/no-see-doctor, post-COVID world. How do you see digital technologies, specifically conversational AI, assisting these field reps in reaching overburdened HCPs?

Sure. So it's an interesting question, because I think when I started getting into this, it was all about my passion of wanting to help the patients who at 2 a.m. are searching for information, Googling and getting the wrong information. And what's interesting is that throughout the course of the five and a half to six years that I've been working specifically in conversational AI, it's been awesome to work on so many HCP solutions, really HCP solutions. I would say we've deployed even more of

those in the course of the past five-plus years than patient solutions, because of, again, going back to that risk.

And I think pharma feels more comfortable using open-field AI for HCPs in the short term than for patients, because there's less risk with adverse-event reporting. So really, what I've seen with HCPs is kind of cool. I mean, again, pre-COVID we already had a growing low-see population because of the burden placed on HCPs with additional admin tasks, EHR burdens, etc. As they had more tasks put on their day, reps obviously had less time to see them, and then when COVID hit, no reps were seeing HCPs face-to-face, so digital tools that weren't allowed before were starting to become okay in the space. So, some cool things that we've used conversational AI for with HCPs, to help field reps educate HCPs:

Definitely digital people, which is something I think Brett mentioned earlier. So if anyone's not familiar, we try to call them digital people and not digital humans, because we never want to make the assumption that it is a human on the other end of the line. But creating customized avatars that speak to you very authentically on video, it feels like you're interacting with a person, so that type of tool is really great for training: educating HCPs either on a new portfolio that they're not familiar with, or educating them on, say, label changes, etc. So we've used it in that way for, basically, med ed. Conversational tools like I mentioned earlier, for augmenting that call center, have then also been used for that quick thing we like to refer to as conversational search.

So, how can we help an HCP on demand when they have a quick question, to get a quick answer without having to go through a deep med-info website? They can quickly just ask a question, even into a toolbar, and using that predictive search or NLU, they type or they ask a question by voice or text and get a quick answer back. So that's been used a lot. And then a cool thing that some companies have deployed is a chat solution that interacts with an IVA. So can you take that IVA a step further? Can you have a

virtual assistant or conversational-search feature where the HCP can ask a question, and then the IVA can flip to that page,

give the answer, and then show the statistics, the graph, whatever relates to that answer?

Interactive visual aids we're talking there, right?

Oh, yeah, thank you, interactive visual aids, yeah, IVAs. And then the last thing is, when we think about field reps, you can actually use virtual assistants to help the field reps themselves. So we've also worked on solutions where we're actually helping field reps get trained. So now that they're getting back in their cars a little bit more, you can still use digital tools to help them while they're driving around their region by having almost like a simulation: creating a digital tool where that field rep can simulate the experience of asking the doctor questions, providing answers, having that conversation. That practice can take place within a car through voice. So that's just a kind of cool thing as well. So there's so much;

I think it's right now like 60% HCP, 40% patient deployments in the space, maybe even 65-35.

Wow, that's surprising. I wouldn't have guessed that.

hmm
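The conversational-search pattern Elise describes above, where an HCP types or speaks a quick question and NLU matches it to a short approved answer, might be sketched very roughly like this. The FAQ entries and the keyword-overlap heuristic are invented for illustration; a real deployment would use a proper NLU or retrieval service against approved medical content:

```python
# Hypothetical sketch of "conversational search": match a typed or
# transcribed question against a small FAQ knowledge base by keyword
# overlap. Questions and answers below are made up for illustration.

FAQ = {
    "what is the recommended starting dose": "See section 2.1 of the label.",
    "how should the dose be adjusted for renal impairment": "See section 2.3.",
    "what are the most common adverse reactions": "Listed in section 6.1.",
}

def tokenize(text: str) -> set[str]:
    """Lowercase and split a question into a set of words."""
    return set(text.lower().replace("?", "").split())

def answer(question: str) -> str:
    """Return the FAQ answer whose question shares the most keywords."""
    q_tokens = tokenize(question)
    best = max(FAQ, key=lambda k: len(tokenize(k) & q_tokens))
    # Fall back when nothing overlaps, rather than guessing an answer.
    if not tokenize(best) & q_tokens:
        return "Sorry, I don't have an answer for that. Contact med info."
    return FAQ[best]

if __name__ == "__main__":
    print(answer("What is the starting dose?"))
```

The same lookup could drive the IVA behavior described above: once the best-matching entry is found, the front end flips to the corresponding page and renders the supporting graph alongside the answer.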

In closing, I'd like to ask you both, even if it may seem cliché to some: what's on the horizon? What's coming next? Where or what are the biggest opportunities coming up in this space, and what does the next generation of applications look like? Elise, how about we start with you?

Sure. So obviously my hope is that we'll get to a place with large language models where they can be fully trusted, and again, I think about that 2 a.m. question and getting the right answer. So my hope is that generative AI

can move forward in a very safe way into the pharma space sooner rather than later. In the healthcare space, in the shorter term, I'm really hoping that we're just able to utilize that open field more and more, to really be able to get meaningful solutions deployed so that anyone can quickly self-serve in a very frictionless way. Frictionless meaning the fewer buttons they have to click on, the less friction there will be in the experience.

And that's what we're all looking for. Brett?

So there are a lot of ways that people are using conversational assistants and generative AI today. Let me focus on a couple that I think are going to emerge in healthcare, and the implications of that.

One is, there's a lot of call center data that never gets looked at unless there's some sort of problem and it gets escalated. I'm seeing people right now use large language models in conjunction with natural-language analytics tools to sift through

enormous reams of call center data that no one ever really looked at before. And what they're doing is identifying problems and identifying opportunities. Some of those problems are things like, oh, we could have done this better, or we had issues that we didn't even know we had. But also the opportunities: they're seeing what people are asking about, and those opportunities are twofold. One is they're helping the agents work better, in terms of being able to answer questions better; they're helping the chatbots work better, all these other types of things. But they're also identifying new things to automate. We are in an era of productivity. Everybody is seeking productivity; we've had very little over the last 20 years. This is the next tool. We're talking about a productivity revolution we haven't seen since the '80s. So if you use tools like HumanFirst, they allow you to look at that call center data, and not just summarize it: they allow you to collate it and look at, on a histogram-type methodology, what's actually happening in there, and then people are using it for decision-making going forward. So I think that's one of the really incredible things that's happening. The other thing is, I think it's

going to wind up being a standard of care for healthcare providers in particular, but I think this might also go into the pharmaceutical world: they're going to have to have their own co-pilots. They're going to have to give co-pilots to their customers, and they're going to have to have co-pilots for themselves, so that they can actually sift through all this information. There's more information than we know what to do with. Large language models, the bots, have arrived just in time, because we created this problem ourselves with this explosion of information. No one can keep up with it, and this is the first tool we've seen that allows you to actually synthesize it, extract information, and use it to help you make better decisions. We used to think about this idea of an assistant that does things for you. We also thought about this idea of an advisor, a

virtual advisor; think of that now as the co-pilot. It helps you with these tasks, it helps you make better decisions. And I'll just tell people, I think

you're going to see a situation where a lot of people will say, oh, it's all overhyped, and all these types of things. But then you're also going to see it pervasively used throughout your life. Just look back 10 years. There's this old saying that

we overestimate the impact of technology in the short term but underestimate its impact in the long term. And this is the most significant shift that we've seen since the web. The web was really important: we went from analog to digital, and it changed everything. It's hard to even conceive of that now, but 30 years ago we didn't have most of the things that we have today.

So that was a huge shift. Mobile was like a change in context for that. But when we think about AI, and generative AI in particular, this is a new generational platform. So it's going to have a 30-year run, and it's going to reach into every part of our lives. And so what I tell people is,

you're probably underestimating how much this is going to impact you and your organization. Today, start looking at it and figuring out ways you can get these quick wins. It's one of those technologies that provides so much benefit immediately that you can get these quick wins. So I always tell people, take a look at it and try it, because seeing is believing. Yeah, and from my limited, you know,

perspective, Brett, I sense that you're right in terms of that historical view: the web ushering in a digital-versus-analog era, and then mobile with the iPhone definitely being a blip on the timeline. But this feels like we're entering a whole new, you know, stadium of possibilities here, a much larger sort of fish tank, you know, for us to expand as a society in our relationship with technology. Sorry for those clunky analogies, but

Well, the bots are coming just in time to save us from all the problems we've created.

The bots have arrived. Yes, I like that.

Well, it's funny, because we used to say "the bots are coming" like six years ago, but now "bot" means a whole different thing than it did even two years ago, or even a year ago. So even just that alone is mind-boggling.
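As a brief aside on Brett's earlier point about collating call-center data on a "histogram-type methodology": the collation step might look something like the sketch below. In practice, the topic label on each record would come from an LLM or NLU classifier; here the records are assumed to arrive pre-tagged, and the topics are invented for illustration.

```python
# Hypothetical sketch: collate tagged call records into a topic
# histogram, most common topic first. Topic labels would come from an
# upstream LLM/NLU classifier; here they are assumed to be pre-tagged.

from collections import Counter

def topic_histogram(calls: list[dict]) -> list[tuple[str, int]]:
    """Count how many calls fall under each topic, most common first."""
    counts = Counter(call["topic"] for call in calls)
    return counts.most_common()

if __name__ == "__main__":
    calls = [
        {"topic": "copay card"},
        {"topic": "side effects"},
        {"topic": "copay card"},
        {"topic": "refill"},
        {"topic": "copay card"},
    ]
    # Render a crude text histogram of what callers are asking about.
    for topic, n in topic_histogram(calls):
        print(f"{topic:12} {'#' * n}")
```

A view like this is what surfaces the "things we didn't know we had" problems and the candidates for new automation that Brett mentions.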

Well, on that note, I say we wrap here. We could go on, you know, for another hour, but we know this technology is vastly powerful; we just don't know all the use cases yet, though they're emerging. So thank you for explaining so eloquently how we can make many of the processes we deal with day to day more efficient and more productive while staying compliant, which is of course of particular importance in the pharma and healthcare spaces. You're both a wealth of knowledge; thank you both so much, this was really fascinating.

Thanks a lot Mark.

Thanks Mark.

Thank you for listening. If anyone has any follow-up questions, you can check out Voicebot.ai, which is one of Brett's websites, or of course ipghealth.com, which is the company that Elise works for. Okay, thank you to Elise and

Brett again, and thank you, everybody out there. We'll see you next time on the MM+M podcast.