OpenAI Co-Founder & CEO Sam Altman speaks onstage during TechCrunch Disrupt San Francisco 2019 at Moscone Convention Center on October 03, 2019, in San Francisco. (Source: Wikipedia photo by Steve Jennings/Getty Images for TechCrunch)

AI Experts Detail OpenAI Power Struggle, Frame It as Accel/Decel Debate

A recent column describing the leadership changes at OpenAI, in which Sam Altman was reinstated as CEO and the board was replaced, explores the debate between “accelerationists” and “decelerationists” over the development and regulation of AI. The story begins:

“When you have them by the stock options, their hearts and minds will follow.” This, it seems, is the moral of the OpenAI leadership story, the full truth of which remains shrouded in non-disclosure agreements and corporate confidentiality, dimly illuminated by unattributed comments.

Sean Welsh, co-author of “An Introduction to Ethics in Robotics and AI” and author of “Ethics and Security Automata,” and Michael T. Bennett, award-winning AGI researcher and former Eigen COO, outline the incidents reported during the OpenAI debacle that played out in November and argue that the company is now squarely in the “acceleration” camp. The story explains it thus:

“It took Arya Stark five seasons of Game of Thrones to get her vengeance. It took Sam Altman five days to crush his enemies, see them driven before him, and retweet the lamentations of the decels.”

The article explores potential reasons for Altman’s initial sacking, concluding that they remain unclear, with speculation ranging from personality clashes to conflicts of interest. The development of Q* (Q star), however, strikes the authors as the more likely possibility.

“Some suggest the coup was triggered by Q* (Q star), a recently leaked OpenAI breakthrough that enables AI to do primary school math. The more lurid accounts suggest this heralds the end of the world. Other scuttlebutt suggests the trigger was OpenAI’s recent Dev Day, at which the company unveiled new products that would make it easier for developers to ship AI products based on ChatGPT. This would make it easier for rogue developers to use AI to do dirty deeds, such as make biological weapons. Advocates of the accel/decel explanation note that two of the departed board members (Toner and McCauley) have decelerationist links.”

The writers emphasize the presence of competing firms in the race toward Artificial General Intelligence and the importance of open societies maintaining a technological edge. The article concludes by rejecting doomsday scenarios and envisioning a future of cooperative competition among AGI firms. It’s a fascinating overview worth reviewing.