OpenAI is launching an ‘independent’ safety board that can stop its model releases

OpenAI is turning its Safety and Security Committee into an independent "board oversight committee" with the authority to delay model launches over safety concerns, according to an OpenAI blog post. The recommendation to create the independent body came out of the committee's recent 90-day review of OpenAI's safety-related processes and safeguards.

The committee, chaired by Zico Kolter and also including Adam D'Angelo, Paul Nakasone, and Nicole Seligman, will be briefed by company leadership on safety evaluations for major model releases and, along with the full board, will exercise oversight over model launches, including the authority to delay a release until safety concerns are addressed, OpenAI said. OpenAI's full board will also receive regular briefings on safety and security matters.

The members of OpenAI's Safety and Security Committee also sit on the company's broader board of directors, so it's unclear how independent the committee actually is or how that independence is structured. (CEO Sam Altman was previously a member of the committee but is no longer.) We've reached out to OpenAI for comment.

By establishing an independent safety board, OpenAI appears to be taking an approach somewhat similar to Meta's Oversight Board, which reviews some of Meta's content policy decisions and can issue rulings the company has to comply with. None of the Oversight Board's members sit on Meta's board of directors.

The committee's review also identified additional opportunities for collaboration and information sharing across the industry to advance the safety and security of AI. The company says it will also look for more ways to share and explain its safety work, as well as more opportunities for independent testing of its systems.

Update, September 16: Added that Sam Altman is no longer on the committee.
