Each quarter we meet with 14 selected Posh clients, the Posh AI Product Council, to hear what pains them most, how Posh Conversational AI has helped, and where we could do better. To no one's surprise, ChatGPT dominated this quarter's discussion. Here is a taste of what we discussed and heard.
We started by discussing what ChatGPT is. I personally believe ChatGPT is an important, but incremental, step in the evolution of AI. Magnus Revang, Research VP at Gartner, articulates this point well in his post Why ChatGPT will not Replace Customer Service Bots Anytime Soon!
ChatGPT is based on OpenAI's GPT-3, which has been commercially available for almost two years now. While incremental, ChatGPT nevertheless marks a notable turning point. It is more powerful than past iterations, is better tuned for dialog, and, perhaps most importantly, was made easily accessible to the world for endless experimentation, as you've no doubt witnessed. We at Posh continue to experiment with new AI technologies, including ChatGPT, in our quest to help our clients and their end users.
Clients told us they want to apply the latest technologies like ChatGPT to best serve their end users. They want to be ahead of the curve, and yet they want to minimize risk. We heard concerns about responses that are too humorous, that veer off-brand or risk giving offense, and that are simply inaccurate. This is banking, and people's money is at stake. Consider Google's embarrassing inaccuracy in its own ad (which sent Google shares tumbling). Or consider how Bing AI recently turned rude, insisting the year was 2022, that it is not a Large Language Model (LLM) (it is), and that it is not vulnerable to prompt injection attacks. These are the risks that give our clients reason for concern.
Yet we also heard enthusiasm for the right applications of LLMs in the right places. One possible application we hashed through together is a way to help call center staff quickly find answers to caller questions within a knowledge base. Imagine you broke open GPT-3 and pulled out just the listening/understanding side but skipped the talking/generative side. We have experimented with doing just that using so-called embeddings. We can supercharge our bot’s understanding with GPT-3 embeddings to decipher strangely worded questions, complex questions, or colloquially stated questions. Once a Posh bot understands the question, the bot could return a pre-approved answer from our client’s knowledge base. The risk here is low. The bot would only respond with canned answers and a call center staff member would review the answer for relevance before passing it along to the customer. This may feel like a small step relative to the hype around ChatGPT. In some ways it is. But it could actually be really helpful for overwhelmed call center staff. And it’s a lot less risky than setting ChatGPT loose on your customers’ bank accounts.
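The retrieval pattern described above can be sketched in a few lines. This is a minimal, illustrative sketch, not Posh's actual implementation: `embed` here is a toy bag-of-words stand-in for a real embedding model (such as GPT-3 embeddings), and the knowledge-base entries are hypothetical. The key idea is that the bot matches the user's question to the closest stored question by embedding similarity and returns only the pre-approved answer.

```python
import math
import re
from collections import Counter

# Toy vocabulary and bag-of-words embedder. A production system would call a
# real embedding model (e.g. GPT-3 embeddings); this stand-in keeps the
# example self-contained and runnable.
VOCAB = ["wire", "transfer", "fee", "card", "lost", "report", "limit", "daily"]

def embed(text):
    counts = Counter(re.findall(r"[a-z]+", text.lower()))
    return [counts[w] for w in VOCAB]

def cosine(a, b):
    # Cosine similarity between two vectors; 0.0 if either is all zeros.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical pre-approved knowledge-base entries, embedded once up front.
KNOWLEDGE_BASE = {
    "How do I report a lost card?":
        "Call the number on the back of your statement to report a lost card.",
    "What is the wire transfer fee?":
        "Outgoing wire transfers carry a flat fee; see our fee schedule.",
}
KB_VECTORS = {q: embed(q) for q in KNOWLEDGE_BASE}

def retrieve(question):
    """Return the canned answer whose stored question is closest in embedding space."""
    qvec = embed(question)
    best = max(KB_VECTORS, key=lambda q: cosine(qvec, KB_VECTORS[q]))
    return KNOWLEDGE_BASE[best]
```

Because the bot only ever returns a pre-approved answer, nothing is generated on the fly, and a call center staff member can still review the retrieved answer for relevance before passing it along.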
I should note that all Posh bots already use embeddings as part of how they understand end users. We evaluated GPT-3 early last year but ultimately stuck with a simpler language model for this primary use case. We stay connected with the AI community and are always evaluating new approaches. But we also acknowledge that these AI innovations must be adopted with care, so that our financial institution (FI) clients can have the best of the technology along with peace of mind.
Dave Bergstein is the Director of Product at Posh. He has been in the technology space for over 20 years and has spent the last 10 in AI-related product management roles. He holds a Ph.D. in Electrical Engineering from Boston University. In his spare time he enjoys being with his family, CrossFit, and walking his dog, Zeus.