After more than a year of compute clusters running constantly in the cloud to train ChatGPT, researchers found that the model became more sophisticated as the data and model size increased. It even exhibited unforeseen capabilities that it wasn’t explicitly trained for, like arithmetic and translation.
While other AI language models were available before ChatGPT, OpenAI’s model became especially famous because it was made available to the public via a simple web application and through an API that developers can use to integrate it into other applications. In fact, ChatGPT accumulated one million users just five days after its launch.
The fintech space has been quick to leverage the potential of ChatGPT, embedding the tech in virtual assistants to carry out customer service and support, offer investment assistance, detect fraud, and more. The result? Fintechs can scale, streamline communication, and serve customers faster and more efficiently with ChatGPT. Here’s how.
Where can ChatGPT be applied in FinTechs?
Find the right financial product or service
When integrated into user experience flows, ChatGPT can power search functionality within a website or via a digital assistant. For example, if a customer searches for an ETF to invest in, ChatGPT could ask the customer questions to narrow down their search – such as “do you have a preference regarding the risk level or expense ratio?”. Based on the customer’s answer, the AI assistant can then generate a list of the company’s relevant ETF products and forward the customer to appropriate product pages.
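To make this concrete, here is a minimal sketch of the final step: once the assistant has gathered the customer’s preferences, it filters the product catalog and returns matching ETFs. The catalog, field names, and `narrow_search` function are all invented for illustration; in a real system the preferences would be extracted from the conversation by the language model.

```python
# Hypothetical sketch: narrow an ETF search using preferences the assistant
# gathered from the customer. Product data and field names are invented.

ETF_CATALOG = [
    {"name": "Global Equity ETF", "risk": "high", "expense_ratio": 0.25},
    {"name": "Corporate Bond ETF", "risk": "low", "expense_ratio": 0.15},
    {"name": "Emerging Markets ETF", "risk": "high", "expense_ratio": 0.45},
]

def narrow_search(risk=None, max_expense_ratio=None):
    """Filter the catalog by the preferences gathered by the assistant."""
    results = ETF_CATALOG
    if risk is not None:
        results = [p for p in results if p["risk"] == risk]
    if max_expense_ratio is not None:
        results = [p for p in results if p["expense_ratio"] <= max_expense_ratio]
    return [p["name"] for p in results]

print(narrow_search(risk="high", max_expense_ratio=0.30))
# ['Global Equity ETF']
```

The assistant can then forward the customer to the product pages behind each returned name.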
Query financial products and service descriptions
Rather than requiring customers to read through the entire text on a product details page, customers can ask the ChatGPT-based AI assistant targeted questions for the information they need. For example, “what is the investment strategy of this ETF?”. The assistant can provide the answer and highlight where in the description it was pulled from. This is particularly useful for complicated products or services with legal specifications.
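The “answer plus source location” behavior can be approximated as follows. This is a toy sketch: simple word-overlap scoring stands in for the language model’s actual retrieval, and the description text is invented.

```python
# Hypothetical sketch: answer a targeted question by locating the most
# relevant sentence in a product description and reporting where it came
# from. Word-overlap scoring is a stand-in for real semantic retrieval.

def find_answer(question, description):
    sentences = [s.strip() for s in description.split(".") if s.strip()]
    q_words = set(question.lower().split())

    def score(sentence):
        return len(q_words & set(sentence.lower().split()))

    best = max(sentences, key=score)
    return best, description.index(best)  # answer and its source position

description = (
    "This ETF tracks a broad global index. "
    "The investment strategy is passive replication of the index. "
    "Fees are charged annually."
)
answer, position = find_answer(
    "What is the investment strategy of this ETF?", description
)
```

The returned position lets the front end highlight the exact passage in the product description that the answer was drawn from.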
Automate customer service
Through ChatGPT-powered customer service assistants, FinTech customers can freely enter their request and receive quick, simple navigation steps in return. For instance, if a customer writes “I moved”, the assistant can understand the implication and ask “would you like to update your address in our system?”. Likewise, if a customer indicates that they want to transfer money, the assistant can ask about the amount, the recipient’s bank details, and what reference to include.
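The “I moved” example amounts to intent detection plus a follow-up question. The sketch below uses a hand-written trigger table purely for illustration; in practice the language model itself performs this classification, and the intent names are invented.

```python
# Hypothetical sketch: map free-form customer messages to an intent and the
# follow-up question the assistant should ask. The trigger table is a toy;
# a real system would let the language model classify the intent.

INTENTS = {
    "update_address": {
        "triggers": ["moved", "new address"],
        "follow_up": "Would you like to update your address in our system?",
    },
    "transfer_money": {
        "triggers": ["transfer", "send money"],
        "follow_up": "How much would you like to transfer, and to which account?",
    },
}

def detect_intent(message):
    text = message.lower()
    for intent, spec in INTENTS.items():
        if any(trigger in text for trigger in spec["triggers"]):
            return intent, spec["follow_up"]
    return "unknown", "Could you tell me more about what you need?"
```

For the transfer case, the assistant would keep asking until all required slots (amount, recipient details, reference) are filled.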
Whatever the use case, the communication capabilities of any AI assistant can always be extended to support voice messages. That is, voice messages from the customer can be transcribed in real-time and textual responses from the assistant can be synthesized into a natural-sounding voice. This greatly enhances accessibility for visually impaired customers.
Detect fraud
A ChatGPT-based fraud detector can recognize nuances in customer conversations based on historical interactions and determine whether malicious actors are at work. For example, if a customer’s spelling, tone, or grammar differs from their usual pattern, the AI tool can flag the conversation as potentially fraudulent, prompting the assistant to ask further questions to confirm the customer’s identity. This behavioral security can similarly be applied to emails and phone calls.
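One way to approximate such behavioral checks is to compare simple style features of a new message against the customer’s historical baseline. This is a deliberately tiny sketch: a production system would use far richer signals than average word length and punctuation.

```python
# Hypothetical sketch: flag a message whose writing style deviates from a
# customer's historical baseline. Average word length and exclamation count
# stand in for real spelling/tone/grammar signals.

def style_features(message):
    words = message.split()
    return {
        "avg_word_len": sum(len(w) for w in words) / len(words),
        "exclamations": message.count("!"),
    }

def is_suspicious(message, baseline, tolerance=2.0):
    current = style_features(message)
    return (
        abs(current["avg_word_len"] - baseline["avg_word_len"]) > tolerance
        or current["exclamations"] > baseline["exclamations"] + 3
    )

# Baseline derived from the customer's past messages (invented example).
baseline = style_features("Please transfer the usual amount to my savings account.")
```

A flagged message would not block the customer outright; it would trigger the extra identity-confirmation questions described above.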
Know Your Customer (KYC)
Drawing again on historical data, a ChatGPT-based analytics tool can produce summaries of customer sentiment, satisfaction levels, and personality. It can provide concrete examples of customer communication, which FinTechs can use for marketing purposes and persona building.
How does ChatGPT work in FinTechs?
ChatGPT is complex and powerful, but to use it safely in FinTech operations, additional logic and safeguards must be employed. This logic ensures that the AI assistant protects both businesses and customers, and produces genuinely useful, appropriate responses.
To start, an array of ‘user message analyzer’ models scrutinize customers’ inputs. For example, a ‘sensitive information detector’ could detect if a customer is sharing their personal details in their message and a ‘sentiment analyzer’ could determine the customer’s mood. The analyzers enrich the customer’s input message with their results as metadata and forward it to the response generator which then produces the AI message using ChatGPT.
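The analyzer stage described above can be sketched as a small pipeline: each analyzer inspects the raw message and attaches its result as metadata before the enriched message is passed on. The analyzer names follow the article; the detection logic itself is a toy stand-in.

```python
# Hypothetical sketch of the 'user message analyzer' stage: each analyzer
# attaches metadata to the incoming message before response generation.
import re

def sensitive_information_detector(text):
    # Toy check: flag anything that looks like an IBAN.
    pattern = r"\b[A-Z]{2}\d{2}[A-Z0-9]{10,30}\b"
    return {"contains_iban": bool(re.search(pattern, text))}

def sentiment_analyzer(text):
    # Toy check: a real sentiment model would be used here.
    negative = {"angry", "upset", "frustrated"}
    words = set(re.findall(r"[a-z]+", text.lower()))
    return {"mood": "negative" if negative & words else "neutral"}

def analyze(message, analyzers):
    enriched = {"text": message, "metadata": {}}
    for analyzer in analyzers:
        enriched["metadata"].update(analyzer(message))
    return enriched  # forwarded to the response generator

result = analyze(
    "I am upset, my IBAN is DE44500105175407324931",
    [sensitive_information_detector, sentiment_analyzer],
)
```

The response generator can then adapt its output to the metadata, for example masking the IBAN and softening its tone for an upset customer.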
If a customer is searching for something on the FinTech’s website, a semantic search model retrieves content from a database according to the customer’s request, while a ranking model orders the search results by relevance.
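Semantic search typically works by comparing embedding vectors, with cosine similarity as the ranking score. The sketch below hand-writes three-dimensional vectors purely for illustration; in production the vectors would come from an embedding model and have hundreds of dimensions.

```python
# Hypothetical sketch: semantic search over site content using toy embedding
# vectors, with results ranked by cosine similarity.
import math

DOCUMENTS = {
    "ETF overview": [0.9, 0.1, 0.0],
    "Savings accounts": [0.1, 0.9, 0.0],
    "Mortgage calculator": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vector, top_k=2):
    ranked = sorted(
        DOCUMENTS,
        key=lambda d: cosine(query_vector, DOCUMENTS[d]),
        reverse=True,
    )
    return ranked[:top_k]
```

A query vector close to the “ETF” direction would rank “ETF overview” first, which the assistant can then turn into a natural-language answer with links.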
A range of ‘AI message analyzer’ models subsequently analyze the AI output. For instance, a ‘compliance checker’ could ensure that the assistant’s response is legally sound, while a ‘moderator model’ checks for offensive or inappropriate content. Only if all output analyzers approve is the message displayed to the customer; otherwise, it is regenerated.
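That approve-or-regenerate gate can be sketched as a loop over candidate responses. The two checkers here are toy substitutes for real compliance and moderation models, and the banned phrases are invented examples.

```python
# Hypothetical sketch of the 'AI message analyzer' gate: a response is shown
# only if every analyzer approves; otherwise a new candidate is generated.

def compliance_checker(text):
    # Toy rule: never promise returns (a real model would check regulations).
    return "guaranteed returns" not in text.lower()

def moderator_model(text):
    # Toy rule standing in for a real content-moderation model.
    return "stupid" not in text.lower()

def generate_approved_response(candidates, analyzers):
    for candidate in candidates:  # each iteration is one (re)generation
        if all(check(candidate) for check in analyzers):
            return candidate
    return "I'm sorry, I can't help with that right now."

approved = generate_approved_response(
    ["This fund has guaranteed returns!",
     "This fund targets long-term growth."],
    [compliance_checker, moderator_model],
)
```

In a live system the `candidates` list would be replaced by repeated calls to the response generator, usually with a cap on retries before falling back to a safe default.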
While the assistant follows this logic, FinTechs can customize the AI to reflect their brand, messaging, and end goals. Prompt engineering enables FinTechs to give the model an identity by supplying adjectives that inform the “personality” of the digital assistant (professional, friendly, concise). Similarly, FinTechs can give ChatGPT commands like ‘act as a facilitator for XYZ actions’ and provide examples of good conversations to nudge the model to behave accordingly.
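Assembling such a system prompt is mostly careful string construction: personality adjectives, a role instruction, and few-shot example exchanges. The wording below is illustrative, not an official prompt format.

```python
# Hypothetical sketch: build a system prompt that gives the assistant a brand
# personality, a role, and few-shot examples of good conversations.

def build_system_prompt(personality, role, examples):
    lines = [
        f"You are a {', '.join(personality)} digital assistant.",
        f"Act as {role}.",
        "Follow the style of these example exchanges:",
    ]
    for user_msg, assistant_msg in examples:
        lines.append(f"Customer: {user_msg}")
        lines.append(f"Assistant: {assistant_msg}")
    return "\n".join(lines)

prompt = build_system_prompt(
    ["professional", "friendly", "concise"],
    "a facilitator for account-related actions",
    [("I moved", "Would you like to update your address in our system?")],
)
```

The resulting string is sent as the system message ahead of the customer’s conversation, so every generated reply inherits the brand voice and the demonstrated behavior.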
What are the risks and remedies of ChatGPT in FinTech?
Like any new tech integration, it’s important to acknowledge the risks and possible solutions associated with using ChatGPT, especially in FinTech, where customers’ money and investments are at stake.
Depending on the data they’re trained on, AI language models can generate inappropriate or harmful responses. However, FinTechs can rely on OpenAI’s guardrails to prevent such responses. Alternatively, FinTechs can build their own moderation models to filter unwanted replies, as outlined above.
FinTechs can also avoid poor recommendations and non-factual responses from ChatGPT by regularly updating the databases that power it. On top of that, FinTechs can improve factual correctness by explicitly instructing the model to admit when it doesn’t know how to respond to a prompt, combining keyword-based searches with semantic search, and creating a ranking model to judge the pertinence of each result.
ChatGPT can have long response latencies depending on the prompt entered and the size of the database being searched. For FinTechs, this lag could cause a dip in customer satisfaction. To reduce latency, FinTechs can run components in parallel – especially the analyzers mentioned above – and additionally build request-response databases where previous inputs and outputs are stored for quick retrieval.
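The request-response store amounts to a cache keyed on the prompt. The sketch below wraps a (slow) generator call with a hash-keyed dictionary; the class and its names are invented, and a production version would add expiry and care about prompts that should never be cached.

```python
# Hypothetical sketch: cache previous request-response pairs so repeated
# prompts are answered without calling the model again.
import hashlib

class ResponseCache:
    def __init__(self, generator):
        self.generator = generator  # the (slow) model call
        self.store = {}
        self.hits = 0

    def respond(self, prompt):
        key = hashlib.sha256(prompt.encode()).hexdigest()
        if key in self.store:
            self.hits += 1
            return self.store[key]
        response = self.generator(prompt)
        self.store[key] = response
        return response

# A stand-in generator; in production this would call the model API.
cache = ResponseCache(lambda p: f"Answer to: {p}")
first = cache.respond("What is an ETF?")
second = cache.respond("What is an ETF?")  # served from the cache
```

Frequently asked questions (“What is an ETF?”, “How do I reset my PIN?”) are answered instantly after the first request, cutting both latency and API cost.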
FinTech is a highly regulated space, and businesses need to ensure that ChatGPT operates as an extension of their compliance efforts. FinTechs can lean on OpenAI’s guardrails and also construct their own compliance checker and sensitive information checker models to control what information is shared and stored, preventing leaks.
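A concrete piece of such a sensitive information checker is redaction before storage, so that a leaked conversation log would not expose customer data. The patterns below are illustrative examples, not a complete PII catalogue.

```python
# Hypothetical sketch: redact sensitive identifiers before a conversation is
# stored. The patterns are examples, not an exhaustive PII list.
import re

PATTERNS = {
    "IBAN": r"\b[A-Z]{2}\d{2}[A-Z0-9]{10,30}\b",
    "EMAIL": r"\b[\w.+-]+@[\w-]+\.[\w.]+\b",
}

def redact(text):
    for label, pattern in PATTERNS.items():
        text = re.sub(pattern, f"[{label} REDACTED]", text)
    return text

safe = redact(
    "Send it to DE44500105175407324931, confirmation to jane@example.com"
)
```

The same checker can run on the model’s inputs and outputs, so sensitive details never reach third-party APIs or long-term storage in the clear.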
ChatGPT can be expensive, particularly as API requests accumulate. OpenAI offers different pricing tiers for FinTechs to choose from, or businesses can use smaller, more affordable language models for less complicated tasks.
Much like ChatGPT itself, the model’s role in FinTech is still evolving and will only become more sophisticated with time. FinTechs that introduce ChatGPT early will have the advantage of a more refined model that represents their brand well and is informed by a diverse range of previous customer communication. With this foundation, FinTechs will be able to leverage ChatGPT for tasks that fuel the next generation of finance.
Discover how intive can make your FinTech’s ChatGPT integration smooth and successful.
Speak to an expert now.