
### Pursuing Billions: OpenAI CEO Sam Altman’s Ongoing Quest for AI Chip Development

ChatGPT needs chips.

Bloomberg reports that Sam Altman is currently in discussions to secure funding for a ‘global’ network of manufacturers dedicated to producing hardware for artificial intelligence.

A recent Bloomberg report reveals that OpenAI CEO Sam Altman is once again seeking to raise significant funds for an AI chip project aimed at establishing an extensive “network of fabrication facilities” worldwide, in collaboration with as-yet-undisclosed top-tier chip manufacturers.

One of the primary challenges and expenses in operating AI models is securing an adequate supply of chips capable of handling the computational demands of technologies like ChatGPT and DALL-E, which depend on that hardware to respond to prompts and generate images. Nvidia’s market value surpassed $1 trillion for the first time last year, thanks in part to its near-monopoly on such chips: models including GPT-4, Gemini, and Llama 2 rely heavily on its widely used H100 GPUs.

Consequently, the race to manufacture more powerful chips for running complex AI systems has grown fiercer. Because so few fabrication facilities can produce high-end chips, Altman and others are moving to secure production capacity well in advance of demand. Competing for that capacity with industry giants like Apple requires deep-pocketed backers willing to cover costs that the nonprofit OpenAI cannot currently bear on its own. Reports suggest that SoftBank Group and G42, an AI-focused company based in Abu Dhabi, are in discussions about funding Altman’s venture.

*Image: The GH200 AI processor platform, showing two dies mounted on a circuit board.*

In parallel, several companies that develop AI models are moving into chip making themselves. Microsoft, an investor in OpenAI, disclosed in November that it is developing a custom AI chip for model training, and Amazon followed shortly afterward by announcing an updated Trainium chip. Google’s chip design division uses its DeepMind AI and Google Cloud servers to develop AI processors such as the Tensor Processing Unit (TPU).

Notably, AWS, Azure, and Google all use Nvidia’s H100 processors. Mark Zuckerberg, the CEO of Meta, told The Verge journalist Alex Heath that “Meta will possess over 340,000 of Nvidia’s H100 GPUs by the year’s end” as part of the company’s push toward artificial general intelligence (AGI).

Nvidia has already unveiled its next-generation GH200 Grace Hopper chips to reinforce its leadership in the field, while competitors such as AMD, Qualcomm, and Intel have introduced processors designed to run AI models on laptops, smartphones, and other devices.
