Hugging Face to Invest $10 Million in Shared GPUs for Free Use by Developers and Researchers
Hugging Face, best known for hosting a vast array of open-source and openly accessible artificial intelligence models, also conducts its own machine learning and AI research, and the company is currently profitable or close to profitability.
Following a successful $235 million fundraising round that valued the company at $4.5 billion, Hugging Face plans to invest $10 million to build a shared GPU cluster. The cluster will be available for free use (subject to application and approval), with the aim of helping small developers, researchers, and AI startups overcome the centralization barriers that are hindering the development of AI technology.
Unlike Baidu, Hugging Face, along with much of the industry, believes that open-source, accessible AI technology can drive the industry forward; proprietary AI technology is not the future Hugging Face envisions.
That is why the company is willing to invest in a shared GPU cluster. Because the hardware is shared, no GPU sits idle: the cluster can potentially run at full capacity around the clock, providing support to developers and startups.
For small developers and AI startups, obtaining GPU compute from cloud platforms is difficult because of the high cost: smaller customers typically have to prepay or settle monthly, whereas large customers can often settle annually.
This is a heavy financial burden for developers, since the cost of training AI models can be astronomically high, and it ultimately holds back the development of the AI industry.
Hugging Face says that access to the shared GPUs will be allocated based on actual demand: if part of the GPU capacity is underutilized, it can be made available to other users. This makes the shared cluster cost-effective and energy-efficient, and well suited to community use.
The shared GPU cluster will be offered under the name ZeroGPU through Hugging Face's application hosting platform, Spaces, and will consist of NVIDIA A100 AI accelerators. Although the A100 delivers only about half the performance of the H100 accelerator, for a shared, free resource this should not be a significant problem for developers.
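For developers, this on-demand allocation shows up as a decorator in Space code: a GPU is attached only while the decorated function runs and is returned to the shared pool afterward. The sketch below is a minimal illustration assuming the publicly available `spaces` package and its `@spaces.GPU` decorator; the model choice and the duration value are illustrative assumptions, not details from Hugging Face's announcement.

```python
# Minimal sketch of a ZeroGPU-style Space (assumptions noted in the text above).
# A GPU is requested only while the decorated function executes and is then
# released back to the shared pool, which is what keeps utilization high.
import spaces
import torch
import gradio as gr
from diffusers import DiffusionPipeline

# Model name is an illustrative assumption, not part of the announcement.
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/sdxl-turbo", torch_dtype=torch.float16
)
pipe.to("cuda")  # under ZeroGPU, the actual GPU is attached lazily at call time

@spaces.GPU(duration=60)  # hold a GPU for at most ~60 seconds per call
def generate(prompt: str):
    # Runs on an A100 borrowed from the shared cluster for the duration of the call.
    return pipe(prompt, num_inference_steps=2, guidance_scale=0.0).images[0]

gr.Interface(fn=generate, inputs="text", outputs="image").launch()
```

Because each call holds hardware only for seconds, many Spaces can time-share the same pool of A100s, which is the cost and energy efficiency the company is pointing to.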