
Last week, Hugging Face CTO Julien Chaumond made a massive announcement.

And it will significantly impact how we build and ship AI products:

Hugging Face models are coming natively to Google Vertex AI.

This is bigger than most people realize...

Today, millions of developers rely on Hugging Face models.

... and millions rely on Google Cloud for production infrastructure.

Until now, moving models between the two hasn't been smooth.

Slow downloads, awkward integrations, no direct caching, and limited TPU support.

Everyone doing this at scale has felt the pain.

This partnership changes that.

Here's what it means in practice:

๐Ÿญ/ ๐—™๐—ฎ๐˜€๐˜๐—ฒ๐—ฟ ๐—บ๐—ผ๐—ฑ๐—ฒ๐—น & ๐—ฑ๐—ฎ๐˜๐—ฎ๐˜€๐—ฒ๐˜ ๐—ฎ๐—ฐ๐—ฐ๐—ฒ๐˜€๐˜€

Hugging Face repos will now cache directly on Google Cloud.

This means dramatically faster upload + download times through Vertex AI and GKE.

If you work with large models, this alone is a huge operational win.
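To make the caching point concrete, here is a minimal sketch (my own illustration, not from the announcement) of the Hub download step that Google Cloud-side caching would speed up; `sshleifer/tiny-gpt2` is just a small public repo used as a stand-in for a real model:

```python
# Minimal sketch: fetching a model snapshot from the Hugging Face Hub.
# Requires `pip install huggingface_hub`; the repo ID is an arbitrary
# small public model used purely for illustration.
from huggingface_hub import snapshot_download

# Per the announcement, when this runs on Vertex AI or GKE the files
# should be served from a cache inside Google Cloud rather than
# crossing cloud boundaries on every pull.
local_dir = snapshot_download(repo_id="sshleifer/tiny-gpt2")
print(local_dir)  # local path where the model files were cached
```

The client code stays the same either way; the win is purely in where the bytes come from.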

๐Ÿฎ/ ๐—ก๐—ฎ๐˜๐—ถ๐˜ƒ๐—ฒ ๐—ง๐—ฃ๐—จ ๐˜€๐˜‚๐—ฝ๐—ฝ๐—ผ๐—ฟ๐˜ ๐—ณ๐—ผ๐—ฟ ๐—ผ๐—ฝ๐—ฒ๐—ป ๐—บ๐—ผ๐—ฑ๐—ฒ๐—น๐˜€

You'll be able to run Hugging Face models on TPUs without the hacky glue code.

This brings serious performance gains for fine-tuning and inference at scale.

๐Ÿฏ/ ๐—” ๐˜€๐—ฎ๐—ณ๐—ฒ๐—ฟ ๐—ฑ๐—ฒ๐—ฝ๐—น๐—ผ๐˜†๐—บ๐—ฒ๐—ป๐˜ ๐—ฝ๐—ถ๐—ฝ๐—ฒ๐—น๐—ถ๐—ป๐—ฒ

Vertex AI's built-in security capabilities (including VirusTotal) will apply directly to Hugging Face assets.

This reduces supply chain risk: something teams rarely talk about, but absolutely should.

๐Ÿฐ/ ๐—Ÿ๐—ผ๐˜„๐—ฒ๐—ฟ ๐—ผ๐˜ƒ๐—ฒ๐—ฟ๐—ฎ๐—น๐—น ๐—”๐—œ ๐—ถ๐—ป๐—ณ๐—ฟ๐—ฎ ๐—ฐ๐—ผ๐˜€๐˜๐˜€

Julien mentioned that over 1,500 TB move between Hugging Face and Google Cloud daily.

That's already over $1B in cloud spend per year.

Optimizing that pipeline = major cost savings for teams working with open models.

๐—ก๐—ผ๐˜„ ๐˜‡๐—ผ๐—ผ๐—บ ๐—ผ๐˜‚๐˜ ๐—ณ๐—ผ๐—ฟ ๐—ฎ ๐˜€๐—ฒ๐—ฐ๐—ผ๐—ป๐—ฑ...

The AI ecosystem is clearly moving in one direction:

More open-source.

More interoperability.

More production-grade support for non-proprietary models.

And this partnership is a giant step in that direction.

If your team uses Vertex AI, TPUs, or large open models, this is one of the most impactful updates of the year.

What do you think of the new partnership?

Image Credit: Julien Chaumond

Nov 25 at 1:46 PM