AI and Data Analytics – new SEC rules in 2024

At the end of July, the SEC approved a plan that it says will root out conflicts of interest that can arise when financial firms use Artificial Intelligence (AI) to serve clients. It has also adopted rules requiring companies to disclose material cybersecurity incidents within four business days of a significant breach.

I would be more impressed if these rules applied to all technology firms that handle customer data analytics, not only financial firms.

The SEC asserts that the new regulations will ensure that ‘predictive data analytics is used to optimize services that better serve clients’ and not for the benefit of the financial firm. Banks and brokerage firms have typically used AI for fraud detection and market surveillance, but recently they have begun applying AI and analytics to trading recommendations, asset management, and lending. This is a huge development with serious implications for consumers. The goal of the new regulation is to ensure that biases are not ingrained in the technology's algorithms, particularly since many vendors and consumers accept technology output as fact, without human verification.

In this vein, the Federal Trade Commission (FTC) has opened an investigation into Microsoft Corp. and OpenAI Inc. (the creator of ChatGPT) to examine what risks the chatbot poses to consumers. These programs are written by humans and can perpetuate biases and discrimination.

The ideal of ‘responsible innovation’ in technology is appealing, but so are responsible capitalism and responsible governance, and we are currently not doing well in any of these areas.

AI has the potential to draw on reams of data to target individual investors and nudge them to alter their behavior around trading, investing, and borrowing, or even to open financial accounts on their behalf. Many of these new tools could be transformative in our time, and I would love to use them. Even so, we should be leery of the concentration of this technology and powerful data in the hands of only a few firms, which poses a serious risk to the future stability of financial markets.

It is important that we not provide our private data to technology or analytics software from unregulated companies when that software has not yet been fully tested and regulated. We need to continue to demand that regulations be developed to ensure the safety of our data, and in particular to add controls on how for-profit firms can use it. I am especially concerned when I see errors in financial software output that are accepted as correct simply because they are software generated.

Edi Alvarez, CFP®
BS, BEd, MS

www.aikapa.com