
Artificial intelligence  |  Crisis management  |  Governance

Lessons in governance from OpenAI


Last week offered an unusual real-life and real-time governance case study with the removal and reinstatement of Sam Altman, co-founder and CEO of OpenAI.

From a government perspective, a few aspects are particularly worth noting:

  • OpenAI has a unique corporate structure. OpenAI Nonprofit was created in late 2015 with the goal of building safe and beneficial artificial general intelligence for the benefit of humanity. In 2019, a “capped profit” subsidiary was established to raise capital and hire the talent needed to achieve the mission.

“Our mission is to ensure that artificial general intelligence—AI systems that are generally smarter than humans—benefits all of humanity.”

https://openai.com/about

  • OpenAI has a unique governance structure. The board of directors of the not-for-profit parent governs the operations of the entire OpenAI structure. It is not tasked with maximising shareholder value.

“The board is still the board of a Nonprofit, each director must perform their fiduciary duties in furtherance of its mission—safe AGI that is broadly beneficial. While the for-profit subsidiary is permitted to make and distribute profit, it is subject to this mission. The Nonprofit’s principal beneficiary is humanity, not OpenAI investors”.

https://openai.com/our-structure

At the time of Sam Altman’s removal on 17th November, the board had a majority of independent directors, none of whom held equity in OpenAI.

  • The rationale for removing Sam Altman has not clearly emerged. Reports indicate that the board said the CEO had not been “consistently candid in his communications with the board”, causing it to lose confidence in his leadership. The board members never gave specific details behind their decision, but one has to assume it was made so that they could fulfil their fiduciary duties.

What can we learn?

Stakeholder engagement. The board did not anticipate the threat of a mass exodus that could have ended OpenAI. The potential resignation of so many employees underscores the importance of stakeholder engagement in times of change.

Board size. The OpenAI board was relatively small, with four members at the time of the decision. Given the complexity of OpenAI’s mission and its unique governance structure, a larger board may have better supported effective decision-making and stakeholder management.

Boardroom composition. In addition to its size, the board lacked traditional governance experts who could have contributed knowledge, skills and experience in audit, finance, risk management, compliance and related areas.

At the time of writing this post, an interim board has been announced; it will be tasked with nominating a 9-member board. One would expect the addition of skills beyond tech, which dominated the previous board’s make-up.

Alignment in the boardroom. Though we do not know precisely, one theory is that OpenAI’s for-profit activities had come into conflict with the mission of the not-for-profit board, creating tension in the boardroom.

Communication skills. Board members must have strong communication skills. And when an organisation faces the level of scrutiny that OpenAI does, media training for the board is a must. While we do not know the specifics, it appears there were communication breakdowns at many levels.


Related resources:

Crisis management & communication

Crisis management and employee/workforce engagement


Written by Elise Perraud, NEDonBoard COO, board member and non-executive director
