Read the letter OpenAI whistleblowers sent to the SEC calling for action on NDAs

OpenAI whistleblowers wrote to the Securities and Exchange Commission seeking an investigation into whether the ChatGPT maker violated SEC rules.

OpenAI whistleblowers wrote to the Securities and Exchange Commission.
  • OpenAI whistleblowers urged the SEC to investigate the ChatGPT maker for potential rule violations.
  • The whistleblowers claim OpenAI used nondisclosure agreements (NDAs) to silence employees.
  • The letter was sent to SEC chair Gary Gensler and Sen. Chuck Grassley's office.

OpenAI whistleblowers are calling on the Securities and Exchange Commission to investigate whether the ChatGPT maker violated SEC rules and prevented employees from speaking out.

Legally protected whistleblowers sent a letter to Gary Gensler, chair of the SEC, on July 1 calling on the regulator to investigate OpenAI. The letter, which was also sent to Sen. Chuck Grassley's office, was later shared with Business Insider.

The letter states that the whistleblowers provided documents to the SEC supporting their claims that OpenAI's NDAs "violated numerous precedents of the SEC."

Sen. Grassley said in a statement shared with Business Insider that assessing the threats posed by AI fell under Congress's constitutional responsibility to protect national security.

He added: "OpenAI's policies and practices appear to cast a chilling effect on whistleblowers' right to speak up and receive due compensation for their protected disclosures. In order for the federal government to stay one step ahead of artificial intelligence, OpenAI's nondisclosure agreements must change."

OpenAI didn't respond to a request for comment from BI. An SEC representative said: "The SEC does not comment on the existence or nonexistence of a possible whistleblower submission."

The whistleblowers' complaint comes after Vox reported in May that OpenAI could take back vested equity from departing employees if they did not sign non-disparagement agreements.

Sam Altman said on X shortly after the report was published that he "did not know this was happening."

Nine former and current OpenAI employees signed an open letter in June calling on major AI firms to ensure greater transparency and better protections for whistleblowers.

William Saunders, a former OpenAI employee who quit earlier this year after losing confidence that the company could responsibly mitigate AI risks, previously told BI about what led to him signing the letter and speaking out.

He said the firing of another former OpenAI employee, Leopold Aschenbrenner, and the requirement that OpenAI staff sign NDAs led to the four principles set out in the June open letter.

Read the full letter sent to the SEC:

The Honorable Gary Gensler
Chair, Securities and Exchange Commission
100 F Street, NE, Washington, DC 20549
July 1, 2024
Re: OpenAI Violations of Rule 21F-17(a) and Implementation of E.O. 14110
Dear Chair Gensler:
We represent the one or more anonymous and confidential whistleblower(s) who filed a formal TCR complaint with the Securities and Exchange Commission ("SEC") documenting systemic violations of the Dodd-Frank Act, 15 U.S.C. § 78u-6, and SEC Rule 21F-17(a) committed by OpenAI. OpenAI is a San Francisco-based tech company best known for its artificial intelligence ("AI") product ChatGPT. Under SEC precedent, and as a matter of law, OpenAI is required to comply with the SEC's regulation prohibiting illegally restrictive non-disclosure agreements ("NDAs").
As explained in the complaint, OpenAI's employment, severance, non-disparagement, and non-disclosure agreements violated SEC Rule 21F-17(a). The agreements prohibited and discouraged both employees and investors from communicating with the SEC concerning securities violations, forced employees to waive their rights to whistleblower incentives and compensation, and required employees to notify the company of communication with government regulators. The SEC has made it abundantly clear that privately held companies that engage in these practices violate the law and are subject to fines and other enforcement actions.
Given the risks associated with the advancement of AI, there is an urgent need to ensure that employees working on this technology understand that they can raise complaints or address concerns to federal regulatory or law enforcement authorities. Likewise, it is critical for companies like OpenAI to understand the illegal nature of their NDAs, and to ensure that their workplace
The SEC must take swift and aggressive steps to enforce SEC Rule 21F-17(a) within the AI sector, and to ensure that there have been no violations of 18 U.S.C. § 1513(e). Executive Order 14110 requires nothing less, acknowledging that every agency of the federal government is responsible for "mitigating" the "substantial risks" posed by AI.
The Executive Order warns that "Artificial intelligence (AI) holds extraordinary potential for . . . peril," and the "irresponsible use" of this emerging technology "could exacerbate societal harms such as fraud, discrimination, bias, and disinformation; displace and disempower workers; stifle competition; and pose risks to national security." The Executive Order therefore concludes that ensuring the safe development of AI technology "demands a society-wide effort that includes government, the private sector, academia, and civil society."
To achieve this end, the Order mandates that agencies such as the SEC enforce existing laws designed to protect the public and investors from fraud. At the heart of any such enforcement effort is the recognition that insiders (i.e., whistleblowers) must be free to report concerns to federal authorities. Moreover, these employees need to be aware of their rights under the Dodd-Frank Act to file such reports confidentially and anonymously directly with the SEC. They also need to know that they cannot be retaliated against for making such reports, and that they are potentially eligible for compensation if their reports result in successful enforcement actions designed to protect the public and investors. Employees are in the best position to detect and warn against the types of dangers referenced in the Executive Order and are also in the best position to help ensure that AI benefits humanity, instead of having an opposite effect.
The SEC's Whistleblower Office was provided with significant documentation demonstrating that OpenAI's prior NDAs violated the law by requiring its employees to sign illegally restrictive contracts to obtain employment, severance payments, and other financial consideration. Given the well-documented potential risks posed by the irresponsible deployment of AI, we urge the Commissioners to immediately approve an investigation into OpenAI's prior NDAs, and to review current efforts apparently being undertaken by the company to ensure full compliance with SEC Rule 21F-17(a).
This request for an investigation is fully supported by the documents provided to the SEC by the Whistleblower(s). The agreements attached as exhibits to the SEC complaint support a finding that OpenAI's use of the NDAs submitted with the complaint violated numerous precedents of the SEC.
SEC precedent requires that an effective enforcement action be undertaken based on the NDAs provided as evidence in the Dodd-Frank complaint. In the SEC's first case addressing the issue of improper NDAs, the Commission sanctioned KBR for an NDA drafted before the Dodd-Frank Act was even passed. The company was sanctioned despite agreeing to fix the language in the NDAs, and despite agreeing to contact employees who had executed these agreements in the past and inform them directly of their right to report wrongdoing to the appropriate authorities.
Additionally, given the large number of improper NDAs used by OpenAI over a long period of time, it is imperative that the Commission ensure that all prior improper NDAs be cured, and that any corrective action taken by OpenAI is consistent with past Commission precedent.
The courage of our client(s) in coming forward creates an opportunity to help ensure that all participants in creating and marketing this new technology will firmly understand that employees and investors always have the right to report wrongdoing, safety issues, and violations of law to the appropriate authorities. The chilling effect of prior NDAs and the harmful message these illegal contracts create within the workplace culture needs to be addressed in an appropriate enforcement action, designed to fully address any harmful impact caused by these practices.
Accountability is at the heart of deterrence, and deterrence is at the heart of the Dodd-Frank Act.
Among the violations documented by the Whistleblower(s) are:
• Non-disparagement clauses that failed to exempt disclosures of securities violations to the SEC;
• Requiring prior consent from the company to disclose confidential information to federal authorities;
• Confidentiality requirements with respect to agreements that themselves contain securities violations;
• Requiring employees to waive compensation that was intended by Congress to incentivize reporting and provide financial relief to whistleblowers.
As we expressed above, even if OpenAI is making reforms in light of the public disclosures of their illegal contracts, the importance of taking appropriate enforcement action is critical – not as an attack on OpenAI or to hinder the advancement of AI technology, but to send the message to others in the AI space, and to the tech industry at large, that violations of the right of employees or investors to report wrongdoing will not be tolerated. The door must be open for potential whistleblowers both at OpenAI and at other companies to come forward concerning misconduct and safety issues possibly occurring throughout the field. The law requires that such complaints be welcomed and rewarded as a matter of law and policy, not discouraged by companies sending direct or indirect messages to employees that they must honor a "code of silence" that has resulted in so many disasters in the past.
As the Senate Judiciary Committee pointed out in its report on the Sarbanes-Oxley Act, the SEC- enforced whistleblower laws are intended to specifically target and eliminate the corporate culture that inhibits lawful disclosure to law enforcement or regulatory authorities:
[The] "corporate code of silence" not only hampers investigations, but also creates a climate where ongoing wrongdoing can occur with virtual impunity. The consequences of this corporate code of silence for investors in publicly traded companies, in particular, and for the stock market, in general, are serious and adverse, and they must be remedied.
SEC action here is perhaps the best way for the development of this rapidly evolving and important industry to proceed in a safe, transparent manner.
Given the potential that advanced AI could "pose an existential risk to humanity," restrictive nondisclosure agreements are particularly egregious. We therefore request that the SEC take the following actions to quickly and effectively reinforce to OpenAI and all of their employees or investors – as well as employees of other companies in this space – that they have a right to file claims with the SEC and other federal or state law enforcement or regulatory authorities:
1. Require OpenAI to produce for inspection every employment agreement, severance agreement, investor agreement, or any other contract that contains a nondisclosure agreement. Upon review of these agreements, the SEC can ensure that none of the employees or other persons who signed these agreements suffered any harm that is explicitly prohibited under the Sarbanes-Oxley Act's obstruction of justice provision, 18 U.S.C. § 1513(e).
2. Require OpenAI to notify all past and current employees as to the violations it committed, notify every past and current employee that pursuant to the Dodd-Frank Act employees have the right to confidentially and anonymously report any violations of law to the SEC, and inform them of all the rights associated with such a report.
3. Fine OpenAI for each improper agreement under the Securities and Exchange Act to the extent the SEC deems appropriate.
4. Direct OpenAI to cure the "chilling effect" of its past practices consistent with the affirmative relief in prior Commission decisions.
Thank you for your time and consideration. We remain available to assist the government on this matter in any way going forward.
Read the original article on Business Insider