Rita Rhodes

Artificial Intelligence for Investment Advisers

Updated: Dec 31, 2023


Artificial Intelligence. It's a hot topic that seems to be on everyone's lips these days. President Biden recently issued an Executive Order requiring federal agencies to use AI safely and responsibly, and encouraging private sector entities to do the same. The EO focuses on protecting privacy, intellectual property, and national security. Back in July, the SEC issued a proposed rule addressing how broker-dealers and investment advisers may use AI (not yet finalized or adopted). Regulators' concerns center primarily on how financial services firms might use AI to generate advice. Mike Kitces recently authored a Nerd's Eye View article about how advisers might use ChatGPT in their practices.


As with any new technology, the initial reactions range between fear of the unknown and excitement of how it will make our lives better. Depending on how old you are, you can probably recall when social media, email, or even the Internet first burst onto the scene. There were similar questions, fear, and excitement. But now these things have become part of our daily lives, and are taken for granted.


And as with any new technology, the regulators will grapple with what AI means to the industry, and what parameters they will put around how it can be used, so that investors and financial markets are protected. Expect regulations to be coming at some point, but with regulators, it can be a slow roll.


I've been receiving questions from advisers about the use of AI in their businesses. For now, my compliance advice is that it makes sense to test it out to see how it might be used in your practice for non-advice-related tasks. I don't think we know enough about it yet -- or more importantly, about what restrictions regulators might place on its use -- to rely on AI to pick investments or build portfolios. But I do see the value in using AI to help increase your operational efficiencies (such as taking notes in meetings or automating tasks), or to help develop creative content (like marketing pieces or client communications).


But keep in mind that you will still need to review any AI output to be sure it is complete and accurate. In that regard, you should have sufficient expertise to know whether the AI is correct. You could ask ChatGPT to write a blog post for you about a complex financial topic -- for example, backdoor Roth conversions -- but if you aren't confident about the rules and tax implications yourself, you can't be 100% certain that the output is accurate. If you aren't certain, verify the facts against another resource.


If you use AI for note-taking, you'll still need to review the output and correct it as necessary, ideally while the meeting is still fresh in your mind. And if you use it to automate tasks, you'll need to periodically verify that all the steps are functioning correctly.


Ultimately, you are responsible for the advisory services, information, and content you provide, regardless of whether you are providing it directly or through a service provider, software, web application, or AI. So, as with any tool you use, make sure it's working right for you.


If you decide to allow AI to be used in your practice, I recommend including guidelines in your compliance manual on how it can be used. I suggest phasing in its use over time so you can assess how it is being used, how effective and accurate it is, and how clients receive it, and so the industry has time to catch up. I'll be working on some suggested policy language for the compliance manual, so let me know if you're interested in seeing that.
