Yesterday, in remarks at Yale Law School, SEC Chair Gary Gensler talked about the opportunities and challenges of AI. According to Gensler, while AI "opens up tremendous opportunities for humanity," it "also raises a host of issues that aren't new but are accentuated by it. First, AI models' decisions and outcomes are often unexplainable. Second, AI also may make biased decisions because the outcomes of its algorithms may be based on data reflecting historical biases. Third, the ability of these predictive models to predict doesn't mean they are always accurate. If you've used it to draft a paper or find citations, beware, because it can hallucinate." In his remarks, Gensler also addressed the potential for systemic risk and fraud. But, in the end, he struck a more positive note, concluding that the role of the SEC involves both "allowing for issuers and investors to benefit from the great potential of AI while also ensuring that we guard against the inherent risks."

At the macro level, one concern Gensler raised was systemic risk. That risk might arise if just a few platforms ultimately dominate the field, resulting in only "a handful of base models upstream." That, he said, "would promote both herding and network interconnectedness. Individual actors may make similar decisions as they get a similar signal from a base model or rely on a data aggregator. Such network interconnectedness and monocultures are the classic problems that lead to systemic risk." Current guidance for risk management will need to be reimagined; the "challenges to financial stability that AI may pose in the future will require new thinking on system-wide or macro-prudential policy interventions."

On a more micro level, Gensler addressed fraud and AI washing. Citing a paper by former SEC Commissioner Kara Stein, he observed that AI can result in programmable harm, predictable harm and unpredictable harm. Programmable harm involves intent, so the analysis of potential liability is fairly straightforward. Predictable harm involves a "reckless or knowing disregard of the foreseeable risks of your actions" in deploying a particular AI model: Did the actor act reasonably to prevent its AI model from taking illegal actions, such as front-running, spoofing or giving conflicted investment advice? Were there appropriate guardrails in place? Were the guardrails tested and monitored? Did the guardrails take into account the possibility that the AI model may be "learning and changing on its own," or may "hallucinate" or "strategically deceive users"? Potential liability for truly unpredictable harm, he said, will play out in the courts. Quoting the first SEC Chair, Joseph Kennedy, he said: "The Commission will make war without quarter on any who sell securities by fraud or misrepresentation."

Whenever a buzzy new technology emerges, false claims about its use often follow. But if "a company is raising money from the public," Gensler cautioned, the company "needs to be truthful about its use of AI and associated risk." He noted that "[s]ome in Congress have proposed imposing strict liability on the use of AI models." He continued:

"As AI disclosures by SEC registrants increase, the basics of good securities lawyering still apply. Claims about prospects should have a reasonable basis, and investors should be told that basis. When disclosing material risks about AI—and a company may face multiple risks, including operational, legal, and competitive—investors benefit from disclosures particularized to the company, not from boilerplate language. Companies should ask themselves some basic questions, such as: 'If we are discussing AI in earnings calls or having extensive discussions with the board, is it potentially material?' These disclosure considerations may require companies to define for investors what they mean when referring to AI. For instance, how and where is it being used in the company? Is it being developed by the issuer or supplied by others? Investment advisers or broker-dealers also should not mislead the public by saying they are using an AI model when they are not, nor say they are using an AI model in a particular way but not do so. Such AI washing, whether it's by companies raising money or financial intermediaries, such as investment advisers and broker-dealers, may violate the securities laws."

Gensler also discussed the potential for AI to "hallucinate," as well as its potential to embed biases and conflicts of interest.

SideBar

AI washing is a concern that Gensler has raised previously. In remarks at an event earlier this year hosted by Public Citizen, reported by law.com, Gensler cautioned companies to be careful about making false claims about their use of AI: "Companies should be 'truthful about what they're saying about their claims'....They should also 'talk about the risks' of using AI 'and how they manage their risk,'" Gensler said. And in remarks in July 2023 to the National Press Club, Gensler also spoke about the growing capacity of AI to make predictions about individuals, with outcomes that may be "inherently challenging to interpret." These predictions could be problematic if they are based on incorrect information, or "on data reflecting historical biases" or "latent features that may inadvertently be proxies for protected characteristics." Or the AI system could create conflicts of interest by optimizing for the interests of the platform (say, the broker or financial adviser) over the interests of the customer. Other issues discussed were the potential for fraud, the expectation that a small number of AI platforms will dominate the field through economies of scale and data networks, and the possibility of financial instability resulting from "herding" behavior. (See this PubCo post.)

The moderator also opened up the discussion for questions from the audience. Of course, there were a couple of questions about the SEC's climate disclosure proposal. One audience member asked about the SEC's role with regard to climate in light of the rules enacted by the EU, California and other jurisdictions. Another audience member observed that, in one reality, the SEC certainly has a role in crafting climate disclosure requirements under the rule of law, but asked whether the staff takes into account the "other reality" that its proposed climate disclosure rules—and ESG rules in general—have become a flashpoint in the culture wars. Does the staff think about the "noise"? Gensler responded that the SEC is certainly not a climate regulator. But, given that, by 2022, a huge proportion of companies already provided some climate disclosure, including many that provided GHG data, the SEC has a role in ensuring consistency and comparability. Gensler observed that, of the 34 rules the SEC has adopted so far, six have been challenged in court and 28 have not. He viewed these legal challenges as very important aspects of democracy. Sustainable rulemaking was also important, he said, and a bad loss could be harmful. In his view, the SEC was taking appropriate actions within the law, but as courts shift their interpretations, that could become more of a challenge. Where will the courts be in 2025 or 2026? The SEC, he said, was certainly looking at the different circuits. In the Fifth Circuit, for example, although that court recently vacated the SEC's rules for disclosure regarding company stock repurchases, the court did not agree that the rules violated the First Amendment. Upholding the rule against the First Amendment challenge was, in Gensler's view, legally very important. (It's worth noting here that the Chamber of Commerce has already challenged California's climate rules on the basis of the First Amendment. See this PubCo post.)

SideBar

In challenging the SEC's buyback disclosure rules, the Chamber of Commerce succeeded in its argument that the rule had violated the Administrative Procedure Act, but it failed in claiming that the rule violated the First Amendment by compelling companies to disclose the rationale for their stock buybacks. In the context of compelled commercial speech, where the issue involves not a restriction on speech, but rather an affirmative obligation to disclose, the court determined that the more lenient standard set forth in Zauderer v. Office of Disciplinary Counsel (1985) was applicable. Under Zauderer, regulations may impose a requirement to disclose purely "factual and non-controversial" information, so long as those disclosures are "reasonably related to a legitimate state interest" and not "unjustified or unduly burdensome."

Petitioners claimed that, by its very nature, an "issuer's subjective opinion about the business benefits of its actions cannot be a purely factual disclosure." Citing NetChoice, L.L.C. v. Paxton (5th Cir. 2022), the court concluded that a requirement to "explain the reason" for a company's actions—i.e., the rationale-disclosure requirement—is still a purely factual disclosure. Petitioners also argued that share repurchase decisions are "one of the most controversial corporate decisions an issuer can make." But the court decided otherwise: "Petitioners, in essence, invite us to hold that the reasons behind a share repurchase are...more controversial than the reasons behind social media censorship,...." the mandated disclosure at issue in NetChoice. The court "decline[d] that invitation."

The court also found that the SEC had satisfied its burden of showing that "the rationale-disclosure requirement is neither unjustified nor unduly burdensome." According to the court, the SEC has a "legitimate interest in promoting the free flow of commercial information," and, because the stated purpose of the rationale-disclosure requirement is "to allow investors to separate out and assess the different motivations behind, and impacts of, share repurchases," the "rationale-disclosure requirement is reasonably related to that interest." In addition, the rationale-disclosure requirement did not burden issuers' protected speech or drown out their messages. Accordingly, the court found that the requirement satisfied Zauderer and was constitutional. (See this PubCo post.)
