
OpenAI Reveals GPT-4.5 Training: 100,000 GPUs, Nearly All Hands on Deck, and "Catastrophic Issues" Emerged

OpenAI recently disclosed details of GPT-4.5's development. Training took two years and 100,000 GPUs, with nearly every employee involved. Along the way the team hit infrastructure failures and hidden bugs, and had to fix problems while training continued.

The model performs roughly 10 times better than GPT-4, and the intelligence gains exceeded expectations. OpenAI found that the key to future breakthroughs lies in data efficiency rather than raw compute. The system architecture is shifting toward multiple clusters, and future training runs may require tens of millions of GPUs working together.
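
The post does not describe the multi-cluster design itself, but the primitive that has to scale is gradient synchronization across workers. Below is a minimal sketch in PyTorch (an assumption; the post names no framework) of flat data-parallel training, with a comment marking where multi-cluster setups typically go hierarchical; all names are illustrative, not OpenAI's actual stack.

```python
import torch
import torch.distributed as dist

def train_step(model, batch, optimizer):
    """One data-parallel step: local backward pass, then gradient averaging."""
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(batch["x"]), batch["y"])
    loss.backward()
    # Average gradients across every worker. At multi-cluster scale this single
    # flat collective is typically replaced by a hierarchical scheme: reduce
    # within each cluster over fast interconnect, then exchange between clusters.
    world = dist.get_world_size()
    for p in model.parameters():
        if p.grad is not None:
            dist.all_reduce(p.grad, op=dist.ReduceOp.SUM)
            p.grad /= world
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    # Assumes launch via torchrun, which sets RANK/WORLD_SIZE/MASTER_ADDR.
    dist.init_process_group(backend="gloo")
    torch.manual_seed(0)  # keep model replicas identical across ranks
    model = torch.nn.Linear(16, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    batch = {"x": torch.randn(8, 16), "y": torch.randn(8, 1)}
    print("loss:", train_step(model, batch, opt))
    dist.destroy_process_group()
```

Launched with, for example, torchrun --nproc_per_node=2; the point of the sketch is that the all-reduce is the one piece whose cost structure changes when workers span more than one cluster.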

The team also shared how the long tail of the data distribution relates to the Scaling Law, along with lessons from co-designing algorithms and systems. GPT-4.5's success confirms that the Scaling Law continues to hold.
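
The post gives no formula for the Scaling Law it refers to; for reference, the widely cited parametric form from Hoffmann et al. (2022) writes loss as an irreducible floor plus power-law terms in parameters and data, which makes the data-efficiency point concrete: once high-quality data D stops growing, the data term bounds progress no matter how large N gets.

```latex
% Widely cited scaling-law form (Hoffmann et al., 2022); not stated in the post.
% L: loss, N: parameter count, D: training tokens,
% E: irreducible loss; A, B, \alpha, \beta: fitted constants.
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
```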

Full sharing content: YouTube


via Tech Circle🎗 Zaihua Channel📮 - Telegram Channel
