ChatGPT and its implications for the customer experience


OpenAI released the beta version of ChatGPT in late November 2022, putting the most powerful natural language processing (NLP) model to date into public hands. It quickly went viral, attracting a million users in its first five days.

Will models like ChatGPT completely replace chatbots?

The premise underlying this question is that large language models (LLMs) such as ChatGPT will transform chatbots from clunky, impersonal and flawed tools into algorithms so capable that (a) human interaction is no longer necessary, and (b) the traditional ways of building chatbots are now completely obsolete. We will explore these assumptions and give our vision of how ChatGPT will impact the CX space.

Generally speaking, we differentiate between conventional chatbots and chatbots like ChatGPT built on generative LLMs.

Conventional chatbots

This category includes most of the chatbots you’ll find out there, from chatbots for checking the status of your DPD delivery to customer service chatbots for multinational banks. Built on technologies such as DialogFlow, IBM Watson or Rasa, they are limited to a specific set of topics and are not able to respond to input outside these topics (i.e. they are domain closed). They can only produce answers that have been pre-written or pre-approved by a human being (i.e. they are not generative).
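The closed-domain, non-generative design described above can be sketched in a few lines: the bot maps each user message to one of a fixed set of intents and replies only with pre-written text, falling back to a canned apology for anything off-topic. The intents, keywords and replies below are invented purely for illustration, not taken from any of the platforms named.

```python
import re

# Minimal sketch of a conventional (closed-domain, non-generative) chatbot:
# every possible reply is pre-approved by a human, and anything outside the
# supported topics hits a fallback. All content here is invented.
INTENTS = {
    "delivery_status": {
        "keywords": {"delivery", "parcel", "package", "tracking"},
        "reply": "You can check your delivery status at example.com/track.",
    },
    "opening_hours": {
        "keywords": {"hours", "open", "closed"},
        "reply": "We are open Monday to Friday, 9am to 5pm.",
    },
}

FALLBACK = "Sorry, I can only help with deliveries and opening hours."

def classify(message):
    """Return the best-matching intent name, or None if nothing matches."""
    words = set(re.findall(r"[a-z]+", message.lower()))
    scores = {name: len(words & spec["keywords"]) for name, spec in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

def respond(message):
    intent = classify(message)
    return INTENTS[intent]["reply"] if intent else FALLBACK
```

Real platforms replace the keyword matcher with a trained intent classifier, but the key property is the same: the reply text itself is never generated by the model.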



LLM-based chatbots

LLM-based chatbots can respond to a wide range of topics (i.e. they are open domain) and generate responses on the fly rather than selecting from a pre-written list (i.e. they are generative). Examples include Google Meena, BlenderBot, ChatGPT and others.

[Table generated by ChatGPT]

LLM-based chatbots and conventional chatbots serve slightly different purposes. In fact, for many CX applications, the open nature of LLMs is less useful and more of an obstacle when creating a chatbot that can specifically answer questions about your product or help a user with a problem they are experiencing.

Realistically, LLMs will not be let loose on the CX domain overnight. The process will be much more nuanced. The name of the game will be marrying the expressiveness and fluency of ChatGPT with the fine-grained control and limits of conventional chatbots. This is something that research-focused chatbot teams will be best suited for.
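One plausible shape for that marriage is a conventional intent router that decides *what* is said, with a generative model only allowed to vary *how* it is said. This is purely a sketch under that assumption; `llm_rephrase` is a deterministic stand-in for a real LLM call, and all names are invented.

```python
# Hybrid sketch: pre-approved answers keep the fine-grained control of a
# conventional bot, while an LLM (stubbed here) adds fluency and tone.
APPROVED_ANSWERS = {
    "refund_policy": "Refunds are available within 30 days of purchase.",
    "contact": "You can reach support at support@example.com.",
}

def llm_rephrase(text, tone):
    """Placeholder for an LLM call that rewrites approved text in a tone.
    A real implementation would prompt a model; here we just tag the text
    so the sketch stays deterministic and runnable."""
    return f"[{tone}] {text}"

def answer(intent, tone="friendly"):
    # The generative model never invents facts: it only ever sees text a
    # human has already approved.
    canned = APPROVED_ANSWERS.get(intent)
    if canned is None:
        return "Sorry, I can't help with that."  # closed-domain fallback
    return llm_rephrase(canned, tone)
```

The design choice is that the LLM sits behind the router, not in front of it, so an off-topic or adversarial input can never reach the generative step.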

Where can you already use ChatGPT today when building chatbots?

There are many aspects of chatbot creation and maintenance for which ChatGPT, in its current state, is not well suited, but here are some where it already shines:

  • Brainstorming possible questions and answers for a given closed domain, either based on your training data or tuned to more specific information, whether through fine-tuning (once OpenAI unlocks that capability when ChatGPT becomes accessible via the API) or by including the desired information through prompt engineering. (Warning: it is still hard to know for sure where information is coming from, so this development process will continue to require a human in the loop to validate the output.)
  • Training your chatbot: ChatGPT can paraphrase questions a user might ask, in a variety of styles, and even generate sample conversations, automating large parts of the training.
  • Testing and quality control: using ChatGPT to test an existing chatbot by simulating user inputs holds great promise, especially when combined with human testers. ChatGPT can be told which topics your tests should cover, at different levels of granularity, and, just as when generating training data, can vary the style and tone it uses.
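The testing idea in the last bullet can be sketched as a replay loop: paraphrases of one user goal, of the kind ChatGPT is good at producing on request, are sent to the bot under test and each reply is checked against the expected topic. Everything here is invented for illustration, and the paraphrases are hard-coded since ChatGPT has no API at the time of writing.

```python
def bot_under_test(message):
    """Stand-in for the deployed chatbot's reply function."""
    if "password" in message.lower():
        return "To reset your password, use the 'Forgot password' link."
    return "Sorry, I didn't understand that."

# Paraphrases of one user goal in different styles, as an LLM might
# generate them, paired with the topic the reply should mention.
SIMULATED_INPUTS = [
    ("I forgot my password, help!", "password"),
    ("How do I reset my account password?", "password"),
    ("Can't log in, think my pw is wrong", "password"),  # informal style
]

def run_suite(bot):
    """Return the simulated inputs whose reply missed the expected topic."""
    failures = []
    for utterance, expected_topic in SIMULATED_INPUTS:
        reply = bot(utterance)
        if expected_topic not in reply.lower():
            failures.append(utterance)
    return failures
```

Running this against the stub, the informal "pw" paraphrase slips past the bot's keyword check and lands in the failure list, which is exactly the kind of gap a human tester would then triage.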

We see the next generation of CX chatbots continuing to be based on conventional and non-generative technology, but generative models being heavily used in the creation process.

Chatbots are set to level up the current CX space

The main impacts of LLMs on consumer expectations will be greater visibility for chatbots, greater urgency to incorporate them into CX, an improved reputation for chatbots, and a higher bar for quality. In other words, chatbots are gaining traction!

We’ve all experienced them: clunky chatbots with extremely limited dialogue options that produce painfully robotic lines (if they can understand anything at all). Underperforming chatbots are already disappearing, rising standards will push them out faster still, and the shift from human to AI support will continue apace.

A recent report predicts that the share of call-center interactions between customers and AI will increase from 2% in 2022 to over 15% by 2026, then double to 30% by 2031. Given the rapid adoption and exponential advances in AI over the past three to five years, however, we anticipate the true growth will be much higher.

Brands like Lemonaid, Oura, Airbnb and ExpressVPN paved the way for excellent 24/7 support — so much so that today’s customers simply expect a seamless experience. The consequences of failing to provide great service are no joke. Poor service can have a significant impact on a brand’s retention rates, causing potential buyers to look elsewhere: according to Forbes, poor customer service costs businesses a total of $62 billion every year.

Risks in using current LLM-based chatbots

ChatGPT is certainly in a hype phase, but there are significant risks to using it as it is now. We believe most of the current risks stem from the unpredictability of ChatGPT, which raises legal, reputational and brand concerns. While the buzz around ChatGPT is good, you shouldn’t forget the associated risks and the importance of selecting the right partner to avoid pitfalls.

In particular, we see the following risks for large companies that adopt LLMs directly into the customer journey:

  1. Brand damage — sharing offensive content
  2. Misleading customers — sharing fake content
  3. Adversarial attack potential — people deliberately trying to break the chatbot to damage the brand’s reputation
  4. False creativity — users mistaking a “stochastic parrot” for creativity or genuine human connection
  5. False authority — ChatGPT produces text that looks authoritative, and humans are notoriously bad at challenging it
  6. Data security, ownership and confidentiality — OpenAI has access to all data shared via ChatGPT, opening huge risks of confidentiality breaches
In other words: “Just because you can doesn’t mean you should.”

Startups and established organizations will inevitably try to introduce safeguards and other measures to mitigate some of these risks. However, many companies, including those we work with, still want (or are legally required) to retain full control of content. Our legal and FCA-regulated clients are a good example. With generative LLMs like ChatGPT, retaining complete control of content is impossible.

When it comes to chatbot development itself, players using open-source stacks like Rasa or Botpress will have the advantage of agility, thanks to the flexibility and versatility these open systems allow. In the short to medium term, chatbot developers with experience in NLP and in using LLMs will be the ones to bring this technology to the chatbot market, because they can effectively leverage and adjust the models to their (or their customers’) needs and use cases.

Long term, small businesses will continue to be better positioned to implement changes quickly than large, established platforms like ChatGPT. Amid the current financial market volatility, however, we anticipate market consolidation over the next 12 to 24 months, with larger players acquiring smaller ones and — a common occurrence in the chatbot space — customers buying their chatbot provider.

Which industries will first adopt ChatGPT in their CX processes?

Despite ChatGPT being only in beta, with no API yet available, individuals have already posted myriad exciting use cases, including various browser extensions, most notably via Twitter.

As long as ChatGPT is available to the public (we expect a volume-based pricing model to come, as with previous models like GPT-3), small players will continue to push the boundaries with new applications.

Victoria Albrecht is Co-Founder and Managing Director of Springbok AI.

