According to Yann LeCun, Meta's chief AI scientist, the American technology company has invested roughly $30 billion in NVIDIA GPUs for AI training.
LeCun highlighted how GPU costs and computational constraints hamper progress. Large as Meta's GPU fleet is, OpenAI's Sam Altman reportedly plans to go further, aiming to use 720,000 NVIDIA H100 GPUs, worth roughly $21.6 billion, and to spend $50 billion a year in pursuit of AGI.
At the AI Summit, LeCun discussed future iterations of Llama-3, which are now being trained and fine-tuned. He also noted that Meta has acquired an additional half a million GPUs, bringing its total estimated GPU investment to $30 billion.
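For context, the arithmetic behind these figures is straightforward. The sketch below is a back-of-the-envelope check, not an official pricing model: the roughly $30,000-per-H100 unit price is inferred from the article's own numbers ($21.6 billion for 720,000 GPUs), and the GPU-count implied by Meta's $30 billion estimate is derived at that same assumed price.

```python
# Back-of-the-envelope check of the figures quoted above.
# Assumption: a unit price of ~$30,000 per H100, inferred from
# the reported $21.6B / 720,000 GPUs (not an official NVIDIA price).

H100_UNIT_PRICE = 21_600_000_000 / 720_000   # ~= $30,000 per GPU

def gpu_fleet_cost(num_gpus: int, unit_price: float = H100_UNIT_PRICE) -> float:
    """Estimated spend for a GPU fleet at the assumed unit price."""
    return num_gpus * unit_price

openai_plan = gpu_fleet_cost(720_000)         # ~= $21.6 billion
meta_implied_gpus = 30_000_000_000 / H100_UNIT_PRICE  # ~= 1 million GPU-equivalents

print(f"OpenAI's reported plan: ${openai_plan / 1e9:.1f}B")
print(f"Meta's ~$30B implies roughly {meta_implied_gpus:,.0f} H100-equivalents")
```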