Meta has invested roughly $30 billion in NVIDIA GPUs for AI training, according to the company's Chief AI Scientist, Yann LeCun. Speaking at an AI summit, LeCun also mentioned upcoming variants of Llama 3 currently in training and fine-tuning.
Meta procured 500,000 additional GPUs, bringing its total to about a million, valued at roughly $30 billion. LeCun pointed to computational limitations and GPU costs as the chief factors hampering progress. OpenAI's Sam Altman, meanwhile, reportedly plans to spend $50 billion annually on AGI development, drawing on 720,000 NVIDIA H100 GPUs worth an estimated $21.6 billion.
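As a quick sanity check on the figures above, both spending estimates imply roughly the same unit price per GPU (a back-of-the-envelope sketch; the ~$30,000 per-GPU figure is inferred from the totals, not stated in either announcement):

```python
# Back-of-the-envelope check of the per-GPU prices implied by the reported totals.
meta_gpus = 1_000_000     # Meta's reported total GPU count
meta_spend = 30e9         # ~$30 billion
openai_gpus = 720_000     # H100 count attributed to OpenAI's plan
openai_spend = 21.6e9     # ~$21.6 billion

meta_unit = meta_spend / meta_gpus        # implied price per Meta GPU
openai_unit = openai_spend / openai_gpus  # implied price per H100

print(f"Implied price per Meta GPU: ${meta_unit:,.0f}")
print(f"Implied price per H100:     ${openai_unit:,.0f}")
# Both work out to roughly $30,000 per GPU.
```

The two independent figures lining up at about $30,000 per unit suggests the reports are using similar list-price assumptions for NVIDIA's data-center GPUs.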
That outlay exceeds the cost of the Apollo space program, underscoring how expensive AI development has become. Other tech giants are likewise investing heavily in GPUs to expand their AI capabilities.
Microsoft is targeting 1.8 million GPUs by year-end, while OpenAI plans for 10 million. NVIDIA continues to ramp GPU production to support these efforts. LeCun stressed the need to scale learning algorithms efficiently across GPUs in order to bring costs down.
As advancements continue, AI companies may lower costs while sustaining demand. The race for AI superiority drives significant investments, shaping the future of technology.