December 8, 2025
In September, the Securities and Exchange Commission (SEC) withdrew 14 proposed rules dating from the Biden administration. The announcement marked a significant shift in the agency’s regulatory approach to the financial sector under President Trump’s SEC Chair, Paul Atkins. In particular, the withdrawal of a proposed rule that would have required investment advisors to “eliminate or neutralize” conflicts of interest arising from their use of artificial intelligence highlights how differently the Biden and Trump administrations assess the threat posed by predictive analytical technologies and how best to regulate this rapidly advancing area.
Under the Biden administration, then-SEC Chair Gary Gensler sounded the alarm about the potential impact of artificial intelligence on the financial markets. Before becoming SEC Chair, Gensler was a professor at MIT and co-authored a research paper arguing that uniform data and model designs would create financial market risks. He explained that “models built on the same datasets are likely to generate highly correlated predictions that proceed in causing crowding and herding,” leading to systemic risks that could unleash a financial crisis. As an example, Gensler cited the 2008 financial crisis, in which systemic risk created by the financial sector’s heavy reliance on the three major credit rating agencies to rate collateralized debt obligations contributed to a global crash. In an August 2023 interview with The New York Times, then-Chair Gensler predicted that just a few AI companies would build the financial models undergirding the economic system, setting up global markets for a crash. “This technology will be the center of future crises, future financial crises,” Gensler said. “It has to do with this powerful set of economics around scale and networks.”
Not surprisingly, the SEC under Gensler proposed a rule in July 2023 that would have prevented broker-dealers and investment advisors from using AI in ways that placed their firms’ interests ahead of investors’ interests. Under the proposed rule, investment firms would also have been required to adopt and maintain written policies and procedures designed to prevent such violations, and to comply with certain record-keeping requirements.
The proposal was criticized for several reasons. First, opponents argued the rule would impose a serious compliance burden on investment firms. Second, they argued it would stifle the development of new technologies and innovation. Finally, they argued the definition of “covered technology” (as applied to AI) was too broad; for example, it would sweep in even simple tools like Excel spreadsheets.
The Trump administration and SEC Chair Atkins have taken positions diametrically opposed to Gensler’s, focusing more on enabling innovation than on enforcement. In July, President Trump announced an “AI Action Plan” that described regulation as a barrier to AI innovation. In August, Chair Atkins announced the creation of an AI Task Force consistent with the administration’s approach. The announcement said that the Task Force would “remove barriers to progress” and “focus on AI applications that maximize benefits.” Furthermore, the SEC under Atkins is examining whether investment firms have proper governance procedures in place to monitor AI technologies, rather than requiring them to eliminate any conflicts of interest associated with new technologies.
Even proponents of AI’s capabilities believe the technology poses a real threat to financial stability and continue to sound the alarm. In July, OpenAI CEO and ChatGPT co-creator Sam Altman warned of a “significant, impending fraud crisis brought about by artificial intelligence.” Whether the new administration’s focus on proper disclosure of investment firms’ AI use and governance, rather than on eliminating conflicts of interest arising from new technologies, proves effective can only be determined over time.