Sam Altman Confronts Backlash Over GPT-5 Rollout, Unveils Expanding Vision During AMA

In brief

Altman admitted OpenAI mishandled the 4o-to-5 upgrade and pledged freer, more transparent tools.
“Adult mode” will relax content limits while protecting minors and users in crisis.
OpenAI’s $1.4 trillion build-out and new foundation will fund AI-driven science and resilience projects.

OpenAI chief executive Sam Altman faced users on Tuesday, answering questions in a live AMA that combined apologies with a sweeping blueprint for the company’s future.

During the AMA, Altman acknowledged missteps in how OpenAI handled the recent transition from GPT-4o to its latest model, GPT-5, in August. He apologized for poor communication around safety filters and pledged to give verified adults more control over what their AI can say.

“If this is going to be a platform that people everywhere can build on, use, and create with, we know they’ll have very different needs and desires,” Altman said. “There will, of course, be some broad limits, but we want users to have real control and customization over how they use it.”

Altman’s contrition soon gave way to ambition, however. In the same session where he apologized for bungling the GPT-5 rollout, he sketched a far-reaching transformation of OpenAI’s structure and scale that dwarfed the controversy.



The new OpenAI Foundation now controls the for-profit OpenAI Group and will channel roughly $130 billion in equity toward scientific and humanitarian projects. Altman also detailed OpenAI’s deepening alliance with Microsoft—extended through 2032 and valued at about $135 billion—cementing the two firms’ shared dominance over frontier models.

And behind it all looms a $1.4 trillion computing build-out: the so-called “Stargate” data-center network that Altman said will eventually churn out a gigawatt of AI compute every week.

From Apology to “Adult Mode”

During the AMA, Altman acknowledged that his earlier comments on content moderation had sparked confusion, admitting that using “erotica” as an example to illustrate OpenAI’s stance on user freedom was a mistake.

In August, OpenAI said it would allow ChatGPT to generate erotic content for verified adults starting in December, a shift away from its historically restrictive approach to sexual content.

“I thought there was a clear difference between erotica and porn bots,” he said. “In any case, the point we were trying to make is that people need flexibility, they want to use these things in different ways, and we want to treat adult users like adults in our own first-party services.”

He said the new “adult mode” would relax moderation limits for verified users while maintaining protections for minors and people in mental-health crises.

“As we build age verification in, and as we can differentiate users in crisis from users who are not, we want to give people more freedom,” he said. “That’s one of our platform principles.”

Toward an AI Researcher

Beyond policy, Altman described the company’s long-term research strategy: progressing from today’s large language models toward an AI research assistant capable of reasoning and discovering new scientific knowledge while remaining safe and interpretable.

“We think it’s plausible that by 2026 models begin to make small discoveries,” he said. “By 2028, medium or maybe even larger ones.”

From ChatGPT to a Platform

Altman said OpenAI’s products were evolving from a single chatbot to a broader AI platform that others could build upon, pointing to users employing GPT-5 in areas like science, engineering, and design.

“You know you’ve built a platform when there’s more value created by people building on it than by the platform builder,” he said.

He also reaffirmed OpenAI’s belief in user privacy, acknowledging that people now share deeply personal information with AI systems.

“They’re talking to it like they would to their doctor, lawyer, or spouse,” he said. “That makes privacy protections—both technical and policy—especially important.”

Fixing the 4o-to-5 Upgrade

Altman also acknowledged that the recent 4o-to-5 upgrade had been rocky for some users, especially writers and creative professionals.

“We definitely learned things about the 4o-to-5 upgrade,” he said. “We’ll try to do much better in the future, both about continuity and about making sure the model gets better for most users, not just for people using AI for science or coding.”

During the AMA, Altman also addressed the future of earlier GPT models, saying that they would not be open-sourced—calling them too large and inefficient—but might be released “as museum artifacts.” He promised continued transparency around safety standards and said future AMA sessions would be part of a broader effort to communicate “how and why” OpenAI’s systems behave as they do.

New Structure for OpenAI

The AMA coincided with the unveiling of a new organizational structure and dual governance model. The OpenAI Foundation, a nonprofit, now controls the for-profit OpenAI Group, a public benefit corporation. The foundation holds about 26% of the company’s equity—worth roughly $130 billion—and will fund projects that use AI for the public good.

OpenAI also announced an expanded long-term partnership with Microsoft. The new agreement gave Microsoft a 27% stake in OpenAI Group, valued at around $135 billion, and extended its exclusive rights to OpenAI’s frontier models through 2032.

The deal allows Microsoft to pursue AGI research independently, while OpenAI can release select open-weight models and partner with outside developers. OpenAI has also committed to purchasing an additional $250 billion in Azure cloud services.

$1.4 Trillion Infrastructure Build-Out

During the livestream, Altman also detailed the company’s massive infrastructure plans: more than $1.4 trillion in financial commitments toward a 30-gigawatt computing build-out, including its first “Stargate” data-center complex in Abilene, Texas.

He said the company eventually hopes to build an “infrastructure factory” capable of producing one gigawatt of compute per week.

“Our goal is to build what we call an infrastructure factory—able to turn out about a gigawatt of compute every week,” Altman said. “We want to drive the cost down to roughly $20 billion per gigawatt over a five-year cycle. That’s going to take major innovation, deep partnerships, and a lot of revenue growth.”
