AI Hardware

Hugging Face Is Sharing $10 Million Worth of Compute To Help Beat the Big AI Companies (theverge.com)

Kylie Robison reports via The Verge: Hugging Face, one of the biggest names in machine learning, is committing $10 million in free shared GPUs to help developers create new AI technologies. The goal is to help small developers, academics, and startups counter the centralization of AI advancements. [...] Hugging Face CEO Clem Delangue is concerned about AI startups' ability to compete with the tech giants. Most significant advancements in artificial intelligence -- like GPT-4, the algorithms behind Google Search, and Tesla's Full Self-Driving system -- remain hidden within the confines of major tech companies. Not only are these corporations financially incentivized to keep their models proprietary, but with billions of dollars at their disposal for computational resources, they can compound those gains and race ahead of competitors, making it nearly impossible for startups to keep up. Hugging Face aims to make state-of-the-art AI technologies accessible to everyone, not just the tech giants. [...]

Access to compute is a significant obstacle to building large language models, one that favors companies like OpenAI and Anthropic, which secure deals with cloud providers for substantial computing resources. Hugging Face aims to level the playing field by donating shared GPUs to the community through a new program called ZeroGPU. The shared GPUs are accessible to multiple users or applications concurrently, eliminating the need for each user or application to have a dedicated GPU. ZeroGPU will be available via Hugging Face's Spaces, a hosting platform for publishing apps, which hosts over 300,000 AI demos built so far on CPU or paid GPU, according to the company.

Access to the shared GPUs is determined by usage: if a portion of the GPU capacity is not actively utilized, that capacity becomes available to someone else. This makes them cost-effective, energy-efficient, and well suited to community-wide use. ZeroGPU runs on Nvidia A100 GPUs, which offer about half the computation speed of the popular and more expensive H100s. "It's very difficult to get enough GPUs from the main cloud providers, and the way to get them -- which is creating a high barrier to entry -- is to commit on very big numbers for long periods of time," Delangue said. Typically, a company would commit to a cloud provider like Amazon Web Services for one or more years to secure GPU resources. This arrangement disadvantages small companies, indie developers, and academics who build on a small scale and can't predict whether their projects will gain traction; regardless of usage, they still have to pay for the GPUs. "It's also a prediction nightmare to know how many GPUs and what kind of budget you need," Delangue said.
