OpenAI’s governance structure and the role of its board in determining the achievement of Artificial General Intelligence (AGI) have raised questions about the future of its partnership with Microsoft and the broader impact on the world. The organization’s complex nonprofit–capped-profit structure, and the potential conflict between its for-profit and nonprofit interests, have come under scrutiny.
OpenAI’s definition of AGI
OpenAI, a leading player in the field of artificial intelligence, has set a defining goal: to create AGI, which it describes as “a highly autonomous system that outperforms humans at most economically valuable work.” The achievement of AGI would be a pivotal milestone that could reshape industries and societies worldwide. However, the ambiguous and evolving nature of AGI makes it challenging to establish clear criteria for its attainment.
OpenAI operates under a unique governance structure in which a nonprofit board of directors, consisting of six members, holds the authority to determine when AGI has been reached. The board includes OpenAI executives, such as CEO Sam Altman, as well as external figures with expertise in AI and policy. Its composition has drawn attention due to potential conflicts of interest and the influence of the Effective Altruism movement.
Microsoft’s investment and stake in OpenAI
Microsoft is a significant investor in OpenAI and has a vested interest in the organization’s success in achieving AGI. Although OpenAI Global, LLC, is a for-profit entity, it is fully controlled by the nonprofit OpenAI and is legally bound to pursue the nonprofit’s mission. Microsoft’s willingness to support that mission of providing safe and beneficial AGI played a crucial role in forming the partnership.
The board’s role in declaring AGI has raised questions about the future of Microsoft’s involvement. Once AGI is achieved, OpenAI’s commercial agreements, including intellectual property licenses, will no longer apply to post-AGI technology. Microsoft’s substantial investment may therefore face uncertainty if OpenAI’s board decides to prioritize nonprofit interests over for-profit ones.
Legal experts have weighed in on the unusual arrangement of a nonprofit board making operational decisions about AGI. While there may be no legal impediment, the potential for conflict between nonprofit and for-profit interests remains a concern. Some argue that corporate law generally imposes a duty on directors to oversee mission-critical issues, which may justify OpenAI’s approach.
Diversity and perspectives
Critics argue that OpenAI’s focus on AGI has overshadowed the present-day impacts of its AI technologies and tools, and concerns have been raised about the diversity of perspectives within OpenAI and on its board. However, the debate about board composition may distract from more fundamental questions about the legitimacy of OpenAI’s mission and its claims regarding AGI.
OpenAI’s shifting definition of AGI, which now suggests that the first AGI will be just one point along a continuum of intelligence, adds to the complexity of the situation. CEO Sam Altman envisions a future with multiple AGIs to avoid concentration of power. These evolving definitions raise questions about the practical implications for Microsoft’s partnership.