
What do financial professionals need to know before using ChatGPT? The top potential ChatGPT compliance issues

Jul 20, 2023

Explore the top potential ChatGPT compliance issues for financial professionals in this latest blog from ComplySci.

Thinking about how AI or ChatGPT could help your firm cut overhead costs and save time? It might not be as simple as you think.

The Washington Post published an article on July 13, 2023, reporting that the United States Federal Trade Commission (FTC) is opening an “expansive investigation into OpenAI.”

The investigation marks a response to what seems like an explosion of artificial intelligence (AI) tools available across service industries, including wealth management.

And while AI has the potential to streamline client communications, produce content and even aid in predictive analytics, it remains largely unexplored territory. Advisors have a unique opportunity right now to enhance client services via AI; however, they currently lack regulatory guidance on AI best practices, leaving them vulnerable to compliance risks.

Today, we’re exploring common use cases for AI technologies like ChatGPT for financial professionals, as well as three major compliance concerns you should be aware of before diving into the AI pool.

Financial professionals and ChatGPT

While the birth of AI is often credited to Alan Turing’s work from the 1950s, the ChatGPT model we’re all familiar with today was released only seven months ago. Since then, the general sentiment among financial professionals seems to be two competing emotions: curiosity and concern, with curiosity winning the metaphorical race.

Related: Artificial Intelligence and Next-Gen Compliance

A recent survey from SmartAsset shows that nearly 60% of advisors are interested in testing out the viral tool, while half of those respondents are already using it in some capacity.

The respondents who choose not to engage with ChatGPT cite privacy concerns and compliance red tape as the driving factors behind their decision.

Use cases for ChatGPT in wealth management

Medium’s three-part series on “50 Ways to Use Chat GPT-3 for Finance” provides a comprehensive list of just how useful AI can be to financial professionals. From drafting blog content to learning how to communicate technical jargon to clients, the use cases appear to be plentiful.

Some of the less obvious suggestions from this list include:

  • Generating code
  • Analyzing a certain stock’s performance
  • Modeling data in Excel

It is important to understand that while ChatGPT can help with each of these tasks, it also has limitations. For example, the system’s training data only extends through 2021; anything more recent is inaccessible to it.

Likewise, ChatGPT’s capabilities are limited by the user’s inputs. The results rely entirely on the prompts you provide, and if you can’t communicate your instructions clearly, you risk getting incorrect or incomplete results.
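
To make that concrete, here is a minimal sketch of a clearly scoped request sent through OpenAI’s Python library as it existed in mid-2023. The model choice, prompt wording and key placeholder are illustrative assumptions, not recommendations:

    import openai

    openai.api_key = "YOUR_API_KEY"  # placeholder; never hard-code a real key

    # A specific, bounded prompt tends to produce more reliable output
    # than a vague request like "tell me about investing."
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "You explain financial jargon in plain English for retail clients."},
            {"role": "user",
             "content": "Explain dollar-cost averaging in two sentences a first-time investor can understand."},
        ],
        temperature=0.2,  # lower temperature favors consistent, conservative wording
    )

    print(response["choices"][0]["message"]["content"])

However the request is phrased, the output should be reviewed by a human before it ever reaches a client.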

AI compliance concerns for financial professionals

There are three main areas of concern that financial professionals should have on their radar before employing ChatGPT or similar AI technologies in the workplace: data confidentiality, systems security and over-reliance on automation.

Data confidentiality

One of the main concerns for AI in wealth management is data privacy. Because these systems may retain the information users enter in order to train future models, any information you provide to the system could be left vulnerable.

Something as simple as entering meeting notes into ChatGPT and asking it to organize the main points may save you time, but it could also mean that you’ve just divulged sensitive client information to a third party.
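
One practical safeguard is to strip obvious client identifiers before any text leaves your systems. Below is a minimal sketch of that idea in Python; the patterns, placeholders and sample note are illustrative assumptions, and a production filter would need a far broader ruleset:

    import re

    # Illustrative patterns only; names, for example, would require entity
    # recognition that this sketch does not attempt.
    REDACTIONS = [
        (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),      # US Social Security numbers
        (re.compile(r"\b\d{8,17}\b"), "[ACCOUNT]"),           # long digit runs (account numbers)
        (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),  # email addresses
    ]

    def redact(text: str) -> str:
        """Replace obvious client identifiers before text is sent to a third-party AI tool."""
        for pattern, placeholder in REDACTIONS:
            text = pattern.sub(placeholder, text)
        return text

    notes = "Met with a client (acct 4411223344, jane@example.com) about a rollover."
    print(redact(notes))  # -> "Met with a client (acct [ACCOUNT], [EMAIL]) about a rollover."

Even with redaction in place, firms should treat anything pasted into a public AI tool as potentially disclosed.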

It’s expected that regulatory bodies like the SEC and FINRA will address these concerns in the near future and provide further guidance for compliance programs. Until then, firms should set clear policies for their staff on which uses are and aren’t appropriate.

Systems security

Like any third-party vendor software employed by your firm, it’s up to you to assess the potential security risks. With AI, your firm could be vulnerable to hackers who wish to use the technology maliciously.

We imagine that governing bodies will develop proper legislation that protects consumers against these risks, beginning with investigations such as the FTC’s referenced above.

Related: Mitigating reputational risks: Five best practices for compliance program management at investment firms

Additionally, verifying client identities could be more complicated as AI advances. Artificial intelligence can easily replicate voices and faces, making it more difficult for financial professionals to discern who they’re truly communicating with.

Over-reliance on automation

Automating processes with AI can save time and reduce error, but it also presents new challenges.

For one, AI could produce biased results through algorithms that prioritize company-set goals over client privacy. That raises the question: Can advice produced by AI be considered fiduciary?

Furthermore, AI’s ability to analyze data and predict trends can be useful, but advisors could become over-reliant on that data to the detriment of their clients. Financial professionals should keep checks and balances in place rather than relying solely on AI to produce objective, logical results for their clients.

AI tools like ChatGPT could pave the way for lower costs and streamlined processes, but they don’t come without compliance risks. With these top compliance concerns in mind, your firm is better equipped to serve clients without running afoul of regulatory rules.

Learn more with ComplySci

Want to explore more ways to keep your firm compliant? Click here to download our free 2023 CCO Playbook or schedule a demo with a member of our team to get started.